CUCET syllabus for M.Sc. Statistics
Hello,
Below is the CUCET syllabus for the M.Sc. in Statistics:
1. Sequences and Series: Convergence of sequences of real numbers; comparison, root and ratio tests for convergence of series of real numbers.
2. Differential Calculus: Limits, continuity and differentiability of functions of one and two variables. Rolle's theorem, mean value theorems, Taylor's theorem, indeterminate forms, maxima and minima of functions of one and two variables.
3. Integral Calculus: Fundamental theorems of integral calculus. Double and triple integrals, applications of definite integrals, arc lengths, areas and volumes.
4. Matrices: Rank and inverse of a matrix, systems of linear equations. Linear transformations, eigenvalues and eigenvectors. Cayley-Hamilton theorem; symmetric, skew-symmetric and orthogonal matrices.
5. Differential Equations: Ordinary differential equations of the first order of the form y' = f(x,y). Linear differential equations of the second order with constant coefficients.
6. Probability (the Statistics portion of the syllabus begins here): Axiomatic definition of probability and its properties, conditional probability, multiplication rule, theorem of total probability, Bayes' theorem (stated after the list) and independence of events.
7. Random Variables: Probability mass function, probability density function and cumulative distribution functions, distribution of a function of a random variable. Mathematical expectation, moments and moment generating function. Chebyshev's inequality (stated after the list).
8. Standard Distributions: Binomial, negative binomial, geometric, Poisson, hypergeometric, uniform, exponential, gamma, beta and normal distributions. Poisson and normal approximations of a binomial distribution.
9. Joint Distributions: Joint, marginal and conditional distributions. Distribution of functions of random variables. Product moments, correlation, simple linear regression. Independence of random variables.
10. Sampling Distributions: Chi-square, t and F distributions, and their properties.
11. Limit Theorems: Weak law of large numbers. Central limit theorem (i.i.d. with finite variance case only; stated after the list).
12. Estimation: Unbiasedness, consistency and efficiency of estimators, method of moments and method of maximum likelihood. Sufficiency, factorization theorem. Completeness, Rao-Blackwell and Lehmann-Scheffé theorems, uniformly minimum variance unbiased estimators. Rao-Cramér inequality (stated after the list). Confidence intervals for the parameters of univariate normal, two independent normal, and one-parameter exponential distributions.
13. Testing of Hypotheses: Basic concepts, applications of the Neyman-Pearson lemma for testing simple and composite hypotheses. Likelihood ratio tests for parameters of the univariate normal distribution.
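
A few of the results above appear in the syllabus by name only, so they are written out below for quick reference. These are the standard textbook statements, not part of the official syllabus wording.

Bayes' theorem (item 6): for events A_1, ..., A_n forming a partition of the sample space with P(A_i) > 0, and any event B with P(B) > 0,

\[
P(A_i \mid B) = \frac{P(B \mid A_i)\,P(A_i)}{\sum_{j=1}^{n} P(B \mid A_j)\,P(A_j)}.
\]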
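
Chebyshev's inequality (item 7): if X has mean \mu and finite variance \sigma^2, then for every k > 0,

\[
P\big(|X - \mu| \ge k\sigma\big) \le \frac{1}{k^{2}}.
\]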
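
Central limit theorem, i.i.d. finite-variance case (item 11): if X_1, X_2, \ldots are i.i.d. with mean \mu and variance \sigma^2, 0 < \sigma^2 < \infty, and \bar{X}_n = \frac{1}{n}\sum_{i=1}^{n} X_i, then

\[
\frac{\sqrt{n}\,(\bar{X}_n - \mu)}{\sigma} \xrightarrow{d} N(0, 1) \quad \text{as } n \to \infty.
\]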
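
Rao-Cramér inequality (item 12): under the usual regularity conditions, any unbiased estimator T of \theta based on an i.i.d. sample X_1, \ldots, X_n from a density f(x; \theta) satisfies

\[
\operatorname{Var}_\theta(T) \ge \frac{1}{n\, I(\theta)}, \qquad I(\theta) = E_\theta\!\left[\left(\frac{\partial}{\partial \theta} \log f(X; \theta)\right)^{2}\right].
\]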
All the best!