Statistics - Adjusted R-Squared
R-squared measures the proportion of the variation in your dependent variable (Y) explained by your independent variables (X) for a linear regression model. Adjusted R-squared adjusts the statistic based on the number of independent variables in the model. ${R^2}$ shows how well terms (data points) fit a curve or line. Adjusted ${R^2}$ also indicates how well terms fit a curve or line, but adjusts for the number of terms in the model: if you add more and more useless variables to a model, adjusted R-squared will decrease; if you add more useful variables, adjusted R-squared will increase.
Adjusted ${R_{adj}^2}$ will always be less than or equal to ${R^2}$. The adjustment is only needed when working with samples; it isn't necessary when you have data for an entire population.
Formula
${R_{adj}^2 = 1 - \left[\frac{(1-R^2)(n-1)}{n-k-1}\right]}$
Where −
${n}$ = the number of points in your data sample.
${k}$ = the number of independent regressors, i.e. the number of variables in your model, excluding the constant.
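The formula can be computed directly from ${R^2}$, ${n}$ and ${k}$. Below is a minimal Python sketch; the function name `adjusted_r_squared` and its structure are illustrative assumptions, not part of this tutorial.

```python
# Illustrative helper (not from the original tutorial):
# adjusted R-squared = 1 - (1 - R^2)(n - 1) / (n - k - 1)

def adjusted_r_squared(r_squared, n, k):
    """Adjust R-squared for n sample points and k predictors (excluding the constant)."""
    if n - k - 1 <= 0:
        raise ValueError("Need n > k + 1 for the adjustment to be defined.")
    return 1 - (1 - r_squared) * (n - 1) / (n - k - 1)
```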
Example
Problem Statement −
A fund has a sample R-squared value of about 0.5, suggesting that it offers higher risk-adjusted returns, based on a sample of size 50 with 5 predictors. Find the adjusted R-squared value.
Solution −
Sample size ${n}$ = 50, number of predictors ${k}$ = 5, sample R-squared = 0.5. Substituting these values into the equation,
${R_{adj}^2 = 1 - \left[\frac{(1-0.5)(50-1)}{50-5-1}\right] \\[7pt]
= 1 - 0.5 \times \frac{49}{44} \\[7pt]
= 1 - 0.5568 \\[7pt]
= 0.4432}$
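As a quick check, the hypothetical `adjusted_r_squared` helper sketched under the formula reproduces this value:

```python
# Worked example: n = 50, k = 5, sample R-squared = 0.5
print(round(adjusted_r_squared(0.5, n=50, k=5), 4))  # prints 0.4432
```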