Monday, May 4, 2020
Signal Processing: Principles of Communication

• The communication process: sources of information, communication channels, the modulation process, and communication networks
• Representation of signals and systems: signals, the continuous Fourier transform, the sampling theorem, sequences, the z-transform, convolution and correlation
• Stochastic processes: probability theory, random processes, power spectral density, the Gaussian process
• Modulation and encoding: basic modulation techniques and binary data transmission: AM, FM, pulse modulation, PCM, DPCM, delta modulation
• Information theory: information, entropy, the source coding theorem, mutual information, the channel coding theorem, channel capacity, rate-distortion theory
• Error control coding: linear block codes, cyclic codes, convolutional codes

Course Material

1. Text: Simon Haykin, Communication Systems, 4th edition, John Wiley & Sons, Inc. (2001)
2. References
(a) B. P. Lathi, Modern Digital and Analog Communication Systems, Oxford University Press (1998)
(b) Alan V. Oppenheim and Ronald W. Schafer, Discrete-Time Signal Processing, Prentice-Hall of India (1989)
(c) Andrew Tanenbaum, Computer Networks, 3rd edition, Prentice Hall (1998)
(d) Simon Haykin, Digital Communication Systems, John Wiley & Sons, Inc.

Duration: 14 weeks

Course Schedule

• Week 1: Sources of information; communication channels, the modulation process and communication networks
• Weeks 2-3: Signals, the continuous Fourier transform, the sampling theorem
• Weeks 4-5: Sequences, the z-transform, convolution, correlation
• Week 6: Basics of probability theory, random processes
• Week 7: Power spectral density, the Gaussian process
• Week 8: Modulation: amplitude, phase and frequency
• Week 9: Encoding of binary data: NRZ, NRZI, Manchester, 4B/5B
• Week 10: Characteristics of a link, half-duplex, full-duplex, time-division multiplexing, frequency-division multiplexing
• Week 11: Information, entropy, the source coding theorem, mutual information
• Week 12: The channel coding theorem, channel capacity, rate-distortion theory
• Week 13: Coding: linear block codes, cyclic codes, convolutional codes
• Week 14: Revision

Overview of the Course

Target audience: Computer Science undergraduates who have not taken any course on communication.

• Communication between a source and a destination requires a channel, and a signal (voice, video or facsimile) is transmitted on that channel. This motivates the basics of signals and systems:
  - It requires a basic understanding of signals: representation of signals.
  - Each transmitted signal is characterised by its power.
  - The power required by a signal is best understood through its frequency characteristics, or bandwidth: representation of the signal in the frequency domain via the continuous Fourier transform.
  - A transmitted signal can be either analog or digital. A signal is converted to a digital signal by first discretising it: the sampling theorem and the discrete-time Fourier transform (see the first sketch at the end of this overview).
  - The frequency-domain interpretation of a discrete-time signal is often easier in terms of the z-transform.
  - Signals are modified by the communication media, and the media are characterised as systems.
  - The output-to-input relationship of a system is characterised by a transfer function.
• Signals in communication are characterised by random variables:
  - Basics of probability
  - Random variables and random processes
  - Expectation, autocorrelation, autocovariance, power spectral density
• Analog modulation schemes:
  - AM, DSB-SC, SSB-SC, VSB-SC, SSB+C, VSB+C
  - Frequency-division multiplexing
  - Power required by each of the above
• Digital modulation schemes:
  - PAM, PPM, PDM (the last two only mentioned briefly)
  - Quantisation
  - PCM, DPCM, DM
  - Encoding of bits: NRZ, NRZI, Manchester
  - Power required by each of the encoding schemes
• Information theory (see the second sketch below):
  - Uncertainty, entropy, information
  - Mutual information, differential entropy
  - Shannon's source and channel coding theorems
  - Shannon's information capacity theorem: analysis of Gaussian channels
• Coding (see the third sketch below):
  - Repetition codes
  - Hamming codes
  - Error detection codes: CRC
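The three short sketches below are illustrative additions, not part of the prescribed course material; all numbers, variable names, matrices and parameters in them are assumed for the sake of example. The first sketch shows the discretisation step from the signals block above: a cosine sampled well above its Nyquist rate, then inspected in the frequency domain with the DFT, where the spectral peak lands at the original tone frequency.

    # Sampling a 50 Hz cosine at fs = 400 Hz (> 2 * 50 Hz, so the sampling
    # theorem is satisfied); the tone and rate are assumed values.
    import numpy as np

    f0 = 50.0                  # tone frequency in Hz
    fs = 400.0                 # sampling rate in Hz, above the Nyquist rate of 100 Hz
    N = 256                    # number of samples (50 Hz falls exactly on a DFT bin)

    n = np.arange(N)
    x = np.cos(2 * np.pi * f0 * n / fs)      # discretised signal x[n]

    X = np.fft.rfft(x)                       # DFT of the sampled signal
    freqs = np.fft.rfftfreq(N, d=1.0 / fs)   # frequency axis in Hz
    peak_hz = freqs[np.argmax(np.abs(X))]
    print(f"spectral peak at {peak_hz:.1f} Hz")   # prints 50.0 Hz

Sampling below 100 Hz would instead alias the tone to a lower frequency, which is exactly the situation the sampling theorem rules out.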
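The second sketch, under the same caveat, computes two of the information theory quantities listed above: the entropy of a discrete memoryless source and the Shannon capacity C = B log2(1 + SNR) of a band-limited Gaussian channel. The source probabilities, bandwidth and SNR are made-up values.

    import numpy as np

    def entropy_bits(p):
        """Entropy in bits/symbol of a discrete distribution p (zero terms dropped)."""
        p = np.asarray(p, dtype=float)
        p = p[p > 0]
        return float(-np.sum(p * np.log2(p)))

    # A 4-symbol source: the uniform distribution maximises the uncertainty.
    print(entropy_bits([0.25, 0.25, 0.25, 0.25]))    # 2.0 bits/symbol
    print(entropy_bits([0.5, 0.25, 0.125, 0.125]))   # 1.75 bits/symbol

    # Capacity of an additive white Gaussian noise channel (assumed figures).
    B = 3000.0                      # bandwidth in Hz
    snr = 10 ** (30.0 / 10.0)       # 30 dB signal-to-noise ratio
    C = B * np.log2(1.0 + snr)
    print(f"capacity = {C / 1000:.1f} kbit/s")       # about 29.9 kbit/s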
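The third sketch illustrates the coding block with the classic (7,4) Hamming code: 4 data bits are encoded into 7, one bit error is introduced, and the syndrome locates and corrects it. The generator and parity-check matrices follow the standard systematic construction and are not taken from the course notes.

    import numpy as np

    # Systematic (7,4) Hamming code: G = [I | P], H = [P^T | I], arithmetic over GF(2).
    G = np.array([[1, 0, 0, 0, 1, 1, 0],
                  [0, 1, 0, 0, 1, 0, 1],
                  [0, 0, 1, 0, 0, 1, 1],
                  [0, 0, 0, 1, 1, 1, 1]])
    H = np.array([[1, 1, 0, 1, 1, 0, 0],
                  [1, 0, 1, 1, 0, 1, 0],
                  [0, 1, 1, 1, 0, 0, 1]])

    data = np.array([1, 0, 1, 1])            # 4 information bits (arbitrary example)
    codeword = data @ G % 2                  # 7-bit codeword

    received = codeword.copy()
    received[5] ^= 1                         # simulate a single bit error on the channel

    syndrome = H @ received % 2              # non-zero syndrome means an error occurred
    # The syndrome matches the column of H at the error position; flip that bit back.
    err_pos = next(i for i in range(7) if np.array_equal(H[:, i], syndrome))
    corrected = received.copy()
    corrected[err_pos] ^= 1

    print("codeword :", codeword)            # [1 0 1 1 0 1 0]
    print("received :", received)            # [1 0 1 1 0 0 0]
    print("corrected:", corrected)           # error at position 5 fixed

A repetition code corrects a single error only by sending every bit three times; the Hamming code achieves the same with three parity bits per four data bits.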