Simulate the explanatory variables. Matrix form of the regression model: finding the least squares estimator. Nonlinear regression is a robust technique for such models because it provides a parametric equation to explain the data. It is very helpful to understand the distinction between parameters and estimates. All you need to know are the values of the variables that enter the equation. Following that, some examples of regression lines, and their interpretation, are given. Linear regression analyses such as these are based on a simple equation. Regression to the Mean. If the regression model is a total failure, SSE is equal to SST, no variance is explained by regression, and R² is zero. The term "regression" was coined by Francis Galton in the nineteenth century to describe a biological phenomenon. As a simple example, the data frame USPop in the car package has decennial U.S. Census population data. Figure 7: actual market pay results with mean, median, and quartile regression lines. That is, β0 is µ0, where µ0 is the mean of the dependent variable for the group coded 0. Find the mean and standard deviation for both variables in context. Model-Fitting with Linear Regression: Exponential Functions. In class we have seen how least squares regression is used to approximate the linear mathematical function that describes the relationship between a dependent and an independent variable by minimizing the variation on the y axis. A least-squares regression equation is an equation of a particular form (linear, quadratic, exponential, etc.) chosen to minimize the squared residuals. The first term is the total variation in the response y, the second term is the variation in the mean response, and the third term is the residual.
Like all forms of regression analysis, it focuses on the conditional probability distribution of y given X, rather than on the joint probability distribution of y and X. Regression noise terms (page 14): what are those epsilons all about? What do they mean? Why do we need to use them? Regression analysis generates an equation to describe the statistical relationship between one or more predictor variables and the response variable. Learn how to make predictions using simple linear regression. If you know how to quickly read the output of a regression, you'll know right away the most important points: whether the overall regression was a good fit and whether this output could have occurred by chance. In the least-squares regression model, y_i = β1 x_i + β0 + ε_i, where ε_i is a random error term with mean 0 and standard deviation σ. By simple linear regression, we mean models with just one independent and one dependent variable. In simpler terms, we'll be finding an equation to represent the correlations present in our dataset. Multiple regression predicts Y from predictors such as X1 (age) and X2 (weight), taking into account the overlap or correlation between the predictors. Before you compute the covariance, calculate the mean of x and y. To generate a rule for selecting predictor variables, we need a definition of what it means for a regression equation to be efficient. The slope b of a regression line ŷ = a + bx is the rate at which the predicted response ŷ changes along the line as the explanatory variable x changes. The end result of multiple regression is the development of a regression equation. If the parameters of the population were known, the simple linear regression equation could be used to compute the mean value of y for a known value of x.
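The least-squares model just described, y_i = β1 x_i + β0 + ε_i with mean-zero errors, can be sketched in Python. The simulated data, the true slope 2.0, the intercept 1.0, and the noise level are all illustrative assumptions:

```python
import numpy as np

# Simulate data satisfying y_i = b1*x_i + b0 + eps_i, eps_i ~ N(0, 0.5).
rng = np.random.default_rng(0)
x = np.linspace(0.0, 10.0, 200)
y = 2.0 * x + 1.0 + rng.normal(0.0, 0.5, size=x.size)

# Closed-form least-squares estimates: slope = Sxy / Sxx,
# intercept from the means of x and y.
b1 = np.cov(x, y, ddof=1)[0, 1] / np.var(x, ddof=1)
b0 = y.mean() - b1 * x.mean()
print(b1, b0)  # estimates near the true values 2.0 and 1.0
```

Because the noise has mean zero, the estimates land close to, but not exactly on, the true parameters; this is the distinction between parameters and estimates mentioned above.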
a is the intercept and X is the independent variable. Actually, in my case I ran a multiple linear regression. Adjusted R squared = 1 − (1 − R²)(n − 1)/(n − k − 1). As a formula it looks like this, and it is easy to use with a little practice and thought. In OLS regression, rescaling a predictor using a linear transformation (e.g., subtracting one value from every individual score) has no effect on the fit of the model, only on the coefficients. SST measures the total variability in Y about the mean. The differences between the observed values and the best-fit line are the residuals of the equation. The larger the value, the better the regression line describes the data. Updated 2017 September 5. Using the covariance formula, you can determine whether economic growth and S&P 500 returns have a positive or inverse relationship. What does regression to the mean mean? Given SSy = 64, SSx = 4, mean of y = 30, mean of x = 10. For example, part of height is due to the genes that we inherit from our parents, but there are also other random influences that may affect your height. To clarify this a little more, let's look at simple linear regression visually. Using confidence intervals when prediction intervals are needed: as pointed out in the discussion of overfitting in regression, the model assumptions for least squares regression assume that the conditional mean function E(Y|X = x) has a certain form; the regression estimation procedure then produces a function of the specified form that estimates the true conditional mean function. The least squares regression method uses an equation to graph fixed and variable costs along with the regression line of cost behavior.
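The covariance step mentioned above, computing the mean of x and y first, can be sketched as follows. The growth and return figures are made-up illustrative numbers, not real data:

```python
def covariance(x, y):
    # Sample covariance: average product of deviations from the two means.
    n = len(x)
    mean_x = sum(x) / n
    mean_y = sum(y) / n
    return sum((a - mean_x) * (b - mean_y) for a, b in zip(x, y)) / (n - 1)

growth = [2.1, 2.5, 4.0, 3.6]      # hypothetical economic growth rates
returns = [8.0, 12.0, 14.0, 10.0]  # hypothetical S&P 500 returns
print(covariance(growth, returns))  # positive, so the two move together
```

A positive result indicates a positive relationship, a negative result an inverse one; the magnitude is unit-dependent, which is why the correlation coefficient is often preferred.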
The goal of regression analysis is to obtain estimates of the unknown parameters β1, …, βK, which indicate how a change in one of the independent variables affects the values taken by the dependent variable. Recall that variance is the mean squared deviation. This is a method of finding a regression line without estimating where the line should go by eye. The test statistic is a ratio of two mean squares (variance estimates), which is distributed as F with k and (N − k − 1) degrees of freedom when the null hypothesis (that the regression slopes are zero) is true. The simple linear regression model (page 12): this section shows the very important linear regression model. Example 1: Find the Deming regression equation for the data in columns A, B and C of Figure 1. How to modify a simple linear regression model in Excel. The difference between the equation and the interpretation that economists attribute to it is much deeper than short-hand versus full specification. How to forecast using regression analysis. We demonstrated using data on heights and weights of some Olympic athletes. In the above example, someone with a GPA of 4.0 would have a predicted self-esteem score of 87 using the regression equation. A classic example is the relationship between rainfall and crop output. The predicted value of Y is a linear transformation of the X variables such that the sum of squared deviations of the observed and predicted Y is a minimum. MEANING OF REGRESSION: When we calculate a regression equation, we are attempting to use the independent variables (the X's) to predict what the dependent variable (the Y) will be. So, what do we mean by "regression assumptions"? The familiar linear regression equation contains many pitfalls to trap the unwary. There are several ways to find a regression line, but usually the least-squares regression line is used because it creates a uniform line. Least Squares Regression Line (LSRL): an example to investigate the steps to develop an LSRL equation. The slope (B1) is highlighted in yellow below.
Find the linear regression equation. The total deviation from the mean is the difference between the actual y value and the mean y value. A regression line, or a line of best fit, can be drawn on a scatter plot and used to predict outcomes for the x and y variables in a given data set or sample data. Estimating and correcting regression to the mean: given our percentage formula, for any given situation we can estimate the regression to the mean. In order to use the regression model, the expression for a straight line is examined first. The Algebra of Linear Regression and Partial Correlation: our goal in this book is to study structural equation modeling in its full generality. Linear regression models for comparing means: in this section we show how to use dummy variables to model categorical variables using linear regression, in a way that is similar to that employed in Dichotomous Variables and the t-test. Lasso regression adds a penalty equal to the sum of the absolute values of the coefficients to the optimization objective. Since we only have one coefficient in simple linear regression, this test is analogous to the t-test. The big issue regarding categorical predictor variables is how to represent a categorical predictor in a regression equation. Suppose you have two variables x and y, with y related to x, but not precisely; there is some random variation. It can be expressed as follows: Ye = a + bX, where Ye is the dependent variable, X is the independent variable, and a and b are the two unknown constants that determine the position of the line. He drew a circle on a blackboard and then asked the officers one by one to throw a piece of chalk at the center of the circle with their backs facing the blackboard.
Graphical example of true mean and variation, and of regression to the mean, using a Normal distribution. More realistically, with real data you'd get a less extreme r-squared. Given a sample of data, the parameters are estimated by the method of maximum likelihood. A regression model expresses a 'dependent' variable as a function of one or more 'independent' variables. What we also see above in the Novartis example is the fitted regression line. If the estimated slope coefficient is positive, there is no mean reversion and we need to change the specification. Sometimes the predicted score is much different from the observed score for a specific individual. We apply the sd function to compute the standard deviation of eruptions. The equation of the regression line is calculated, including the slope of the regression line and the intercept. The standard deviation of an observation variable is the square root of its variance. Linear regression is a highly common predictive analysis technique used by mathematicians and scientists. The regression line formula is like the following: Y = a + bX + u. The multiple regression formula looks like this: Y = a + b1X1 + b2X2 + b3X3 + … + btXt + u. The Mean and Variance of the Transformed Scores. This skill test was designed to test your conceptual and practical knowledge of various regression techniques. The distances between the regression line and the data points are the residuals of the regression model. The equation must be chosen so that the sum of the squares of the residuals is made as small as possible.
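The relationship stated above, the standard deviation as the square root of the variance (the mean squared deviation), can be checked with a short sketch; the data values are arbitrary:

```python
import math

def variance(xs):
    # Mean squared deviation from the mean (population form).
    m = sum(xs) / len(xs)
    return sum((v - m) ** 2 for v in xs) / len(xs)

data = [2, 4, 4, 4, 5, 5, 7, 9]  # mean 5, squared deviations sum to 32
sd = math.sqrt(variance(data))
print(sd)  # -> 2.0
```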
All we need to know is the mean of the sample on the first measure, the population mean on both measures, and the correlation between the measures. A regression model predicts a numeric target value for each case in the scoring data. Multiple Regression with Two Predictor Variables. In statistics, the purpose of the regression equation is to come up with an equation-like model that represents the pattern or patterns present in the data. The outcome is assumed to follow a Poisson distribution, and with the usual log link function the outcome is assumed to have mean μ, with log(μ) linear in the predictors. Thus the vector of fitted values, m̂(x), or m̂ for short, is m̂ = x b̂ (35). Using our equation for b̂, m̂ = x(xᵀx)⁻¹xᵀy (36). If the regression model is used for formal inference (p-values and the like), then certain assumptions about the distribution of the residuals should be checked. Chapter 5: Regression. There are many ways to score a regression model. It is prevalent in sport and can explain the "manager of the month" phenomenon. Regression is much more than just linear and logistic regression. There is a lot more to the Excel regression output than just the regression equation. Reading and Using STATA Output. One variable is considered to be an explanatory variable, and the other is considered to be a dependent variable. As you will see below, a regression line is a straight line that represents the relationship between an x-variable and a y-variable.
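The regression-to-the-mean estimate described above, built from the sample mean on the first measure, the population mean, and the correlation between measures, can be sketched as a predicted follow-up mean. The function name and all numbers are hypothetical:

```python
def rtm_estimate(first_measure_mean, population_mean, r):
    # Expected second-measure mean: the population mean plus r times the
    # initial deviation; with |r| < 1 the group regresses toward the mean.
    return population_mean + r * (first_measure_mean - population_mean)

# Hypothetical values: a selected group averaged 130 on the first test,
# the population mean is 100, and the test-retest correlation is 0.7.
print(rtm_estimate(130, 100, 0.7))  # -> 121.0
```

The selected group's expected follow-up mean (121) sits between its first-test mean (130) and the population mean (100), which is exactly the regression effect described in this document.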
y_t = μ + ε_t: in this case we expect the time series to vary randomly about its mean. Note: a simple way to do this is to plot the residuals e_i = y_i − ŷ_i against the estimated response. The best fitting line, in other words, is the one with the smallest squared deviations between the line and the points on the graph. Lasso regression minimizes the residual sum of squares subject to the constraint that the sum of the absolute values of the coefficients is at most t; here, t is the constraint value. The logistic regression model is simply a non-linear transformation of the linear regression. A predicted probability close to one means the case probably falls in the category. SSE measures the variability in Y about the regression line. Multiple Regression, Selecting the Best Equation: when fitting a multiple linear regression model, a researcher will likely include independent variables that are not important in predicting the dependent variable Y. There are (k + 1) degrees of freedom associated with a regression model with (k + 1) coefficients β0, β1, …, βk. The other regression coefficients are the differences between the Basal mean and the other group means. Economists, since the time of Haavelmo (1943), have taken the structural equation Y = βx + ε to mean something totally different from regression, and this something has nothing to do with the distribution of X and Y. The two topics on which attention is focused are: (1) the fundamental assumptions which must be satisfied if the application of the classical linear regression model is to be totally valid. Multiple Linear Regression, the population model: in a simple linear regression model, a single response measurement Y is related to a single predictor (covariate, regressor) X for each observation. R² indicates how much of the total variability in Y is explained by the regression model. Techniques and Methods 4–A8 (version 1.0), by Ken Eng, Yin-Yu Chen, and Julie E.
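The residual check in the note above (plotting e_i = y_i − ŷ_i against the estimated response) starts from the residuals themselves, which can be sketched as follows. The simulated data and the use of numpy's polyfit are assumptions for illustration:

```python
import numpy as np

rng = np.random.default_rng(1)
x = np.linspace(0.0, 5.0, 100)
y = 3.0 * x + 2.0 + rng.normal(0.0, 1.0, size=x.size)

# Fit a line, then compute the residuals e_i = y_i - yhat_i.
b1, b0 = np.polyfit(x, y, 1)  # returns slope first, then intercept
yhat = b0 + b1 * x
resid = y - yhat

# With an intercept in the model, least-squares residuals sum to
# (numerically) zero, so their mean carries no information; what matters
# for the diagnostic is their *pattern* when plotted against yhat.
print(abs(resid.mean()) < 1e-8)  # -> True
```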
Centering can use a group mean (e.g., the mean for females) rather than the overall mean. Regression measures the average relationship between two or more variables in terms of the original units of the data. The R-squared is generally of secondary importance, unless your main concern is using the regression equation to make accurate predictions. Simple Linear Regression in SPSS (STAT 314). Prediction and Confidence Intervals in Regression (Statistics 621, Lecture 3): use the formula to generate more values. Figure: effect of hours of mixing on temperature of wood pulp (hours of mixing vs. temperature). It is the starting point for regression analysis: the forecasting equation for a regression model includes a constant term plus multiples of one or more other variables, and fitting a regression model can be viewed as a process of estimating several means simultaneously from the same data, namely the "mean effects" of the predictor variables as well as the overall mean. It also produces the scatter plot with the line of best fit. Linear least squares regression is the de facto method for finding lines of best fit that summarize the relationship between two given variables. The regression effect does not say that an individual who is a given number of SD from average in one variable must have a value of the other variable that is closer to average; merely that individuals who are a given number of SD from the mean in one variable tend, on average, to be fewer SD from the mean in the other.
The "logistic" distribution is an S-shaped distribution function which is similar to the standard-normal distribution (which results in a probit regression model) but easier to work with in most applications (the probabilities are easier to calculate). A table lists the y-intercept and slope estimates along with their hypothesis test results. Regression gives 9 = l32. The total deviation from the mean is the difference between the actual y value and the mean y value. In the following statistical model, I regress 'Depend1' on three independent variables. • The big issue regarding categorical predictor variables is how to represent a categorical predictor in a regression equation. 5—they regress to a lower mean. A regression equation models the dependent relationship of two or more variables. There are two common ways to express the spatial component, either as a Conditional Autoregressive (CAR) or as a Simultaneous Autoregressive (SAR) function (De Smith et al. 8 or higher denote a strong correlation. Once, we built a statistically significant model, it’s possible to use it. This article will quickly introduce three commonly used regression models using R and the Boston housing data-set: Ridge, Lasso, and Elastic Net. MEANING OF REGRESSION:. Define regression equation. If the regression model is a total failure, SSE is equal to SST, no variance is explained by regression, and R 2 is zero. Regression Equation of Y on X: This is used to describe the variations in the value Y from the given changes in the values of X. The primary use of linear regression is to fit a line to 2 sets of data and determine how much they are related. Now let us understand lasso regression formula with a working example: The lasso regression estimate is defined as. Things like stock market prices, golf scores, and chronic back pain inevitably fluctuate. This multiple. The Formula for the Percent of Regression to the Mean. 
Regression to the mean: the tendency of scores that are particularly high or low to drift toward the mean over time (examples: Air Force flight training on good and bad days; operant conditioning, reward vs. punishment). We've just recently finished creating a working linear regression model, and now we're curious what is next. Linear regression with one independent variable: the larger the deviations between the "fitted" regression line and the data points, the poorer the regression equation describes the data. Regression analysis is used when you want to predict a continuous dependent variable or response from a number of independent or input variables. The equation for linear regression is essentially the same, except the symbols are a little different: basically, this is just the equation for a line. Fit a multiple regression equation giving mean grain yield in terms of mean ear number. What exactly is the statistical principle known as "regression to the mean"? It's a mathematical artifact of the way we define a regression line. Once you have completed the formula and pressed Enter or Return, you will see a single value in the cell, which is the slope of the regression line. It's used to predict values within a continuous range (e.g., sales or price). The "normal equations" for the line of regression of y on x are: Σy = na + bΣx and Σxy = aΣx + bΣx². β1 is the "effect," so to speak, of "moving" or changing from the group coded 0 to the group coded 1. Regression Estimation: Least Squares and Maximum Likelihood. The more variance that is accounted for by the regression model, the closer the data points will fall to the fitted regression line.
Analysis Step Two: find the mean and standard deviation of D. β0 is the intercept and β1 is the slope. At very first glance the model seems to fit the data and makes sense given our expectations and the time series plot. The Variables: essentially, we use the regression equation to predict values of a dependent variable. This confirms what we see in the graph: this line does a poor job of summarizing the relationship. In most cases, we do not believe that the model defines the exact relationship between the two variables. The Beta-in (the standardized regression coefficient for the respective variable if it were to enter the regression equation as an independent variable); the partial correlation (between the respective variable and the dependent variable, after controlling for all other independent variables in the equation). We should know that the regression equation is an estimate of the true regression equation. Technically, B0 is called the intercept because it determines where the line intercepts the y-axis. If you share parameters (perform global nonlinear regression), SSres in the equation above is the sum of squares reported by Prism in the Global results column for the model with shared parameters, and SStot is the sum of squares of each Y value (from each data set) around the mean of ALL Y values (from all data sets). The dictionary meaning of the word Regression is 'stepping back' or 'going back'. (2) The alternative techniques which may be employed when these assumptions are not satisfied. It shows the proportion of the variation in y_i that is accounted for by the regression.
Enter the data in L1 (non-exercise activity). Here is an example of manual computation of a simple linear regression: for our simple regression model we will take Symptom Score (sym2) as our dependent variable and Impulse Control (ic2) as our independent variable. Regression Equation: the equation of the best-fitting line through a set of data. If you have a linear regression equation with only one explanatory variable, the sign of the correlation coefficient shows whether the slope of the regression line is positive or negative, while the absolute value of the coefficient shows how close to the regression line the points lie. The Math Behind Polynomial Regression. Linear regression is a supervised machine learning algorithm where the predicted output is continuous and has a constant slope. There is little extra to know beyond regression with one explanatory variable. R² = 1 − SSR/SST is 1 minus the proportion of the variation in y_i that is unexplained.
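The identity R² = 1 − SSR/SST can be computed directly; the observed and predicted values below are arbitrary illustrative numbers:

```python
def r_squared(y, yhat):
    # R^2 = 1 - SSR/SST: the share of the variation in y the model explains.
    mean_y = sum(y) / len(y)
    sst = sum((yi - mean_y) ** 2 for yi in y)             # total sum of squares
    ssr = sum((yi - fi) ** 2 for yi, fi in zip(y, yhat))  # residual sum of squares
    return 1 - ssr / sst

y = [1.0, 2.0, 3.0, 4.0]
yhat = [1.1, 1.9, 3.2, 3.8]
print(round(r_squared(y, yhat), 3))  # -> 0.98
```

When the model is a total failure, SSR equals SST and this function returns zero, matching the statement made earlier in this document.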
For each value of the independent variable, the value of the dependent variable can be represented by a random variable whose mean lies on the regression line. Overview of the Logistic Regression Model. Two Common Types of Multiple Regression. Linear regression in the field of educational research. Unfortunately, the RTO (regression-through-the-origin) residuals will usually have a nonzero mean, because forcing the regression line through the origin is generally inconsistent with the best fit. In simple linear regression we have one predictor and one response variable, but in multiple regression we have more than one predictor variable and one response variable. Mean Square Residual (Error) = SS_RES/DF_RES = 166254.9/18 ≈ 9236.4. The simplest form of the regression equation with one dependent and one independent variable is defined by the formula y = c + b*x, where y = estimated dependent variable score, c = constant, b = regression coefficient, and x = score on the independent variable. This equation is called the "population (or true) regression line of Y on X." Predicted Probability from Logistic Regression Output: it is possible to use the output from logistic regression, and the means of variables, to calculate the predicted probability of different subgroups in your analysis falling into a category. Multiple regression is an extension of simple (bivariate) regression. Regression analysis is primarily used for two conceptually distinct purposes. The aim of regression is to find the linear relationship between two variables.
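The simplest form y = c + b*x reduces to a one-line prediction; the constant and coefficient below are hypothetical fitted values, for illustration only:

```python
def predict(c, b, x):
    # y = c + b*x: estimated dependent-variable score from the
    # constant c and the regression coefficient b.
    return c + b * x

# Hypothetical fit: constant 200, coefficient 0.8, predictor score 50.
print(predict(200, 0.8, 50))  # -> 240.0
```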
Suppose our regression equation is: Test B = 200 + … If you need help getting data into STATA or doing basic operations, see the earlier STATA handout. The S.E. of regression is Se = [∑ei²/(n−k−1)]^(1/2); the sum of squared residuals is ∑ei². The Durbin-Watson stat is the diagnostic statistic used for checking whether the ei are autocorrelated rather than independently distributed. Mean Square Regression = SS_REG/DF_REG = 1527482. Posc/Uapp 816, Class 20: Regression of Time Series. This equation predicts an average score of 10.50 if both d and s equal zero, i.e., for the children with zero values on both d and s. The least squares regression equation is listed at the top along with the observed correlation coefficient and other information that describes the model fit. The Regression Line: in previous sections, we saw how to plot calibration data and view a regression equation. While some say that regression to the mean occurs because of some kind of (random) measurement error, it should be noted that IQ regression-to-the-mean analyses are usually performed using the method of estimated true scores, that is, IQ scores corrected for measurement error, or unreliability, with the formula T̂ = r_XX′(X − M_X) + M_X. See examples and explanations in this article: Techniques for scoring a regression model in SAS. In this section, we see how to compute the regression equation and use it effectively. Standardized regression coefficients: if you standardize each dependent and independent variable (that is, subtract the mean and divide by the standard deviation of the variable), you will get the standardized regression coefficients.
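The standardization recipe above (subtract the mean, divide by the standard deviation) can be sketched to show that, in simple regression, the standardized slope equals the correlation coefficient; the data values are arbitrary:

```python
import statistics as st

def standardize(xs):
    # z-scores: subtract the mean, divide by the sample standard deviation.
    m, s = st.mean(xs), st.stdev(xs)
    return [(v - m) / s for v in xs]

x = [1.0, 2.0, 3.0, 4.0, 5.0]
y = [2.0, 4.0, 5.0, 4.0, 5.0]
zx, zy = standardize(x), standardize(y)

# The standardized slope: sum of z-score products over (n - 1).
beta = sum(a * b for a, b in zip(zx, zy)) / (len(x) - 1)
print(round(beta, 4))  # equals Pearson's r for these data
```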
He then repeated the experiment and recorded each officer's performance in the first and second trial. Single records (including regression outliers) can have a big influence on a regression equation with small data, but this effect washes out in big data. Regression lines are lines drawn on a scatterplot to fit the data and to enable us to make predictions. It involves the following: if the current price is greater than the upper Bollinger band, sell the stock; if the current price is less than the lower Bollinger band, buy the stock. The Bollinger bands are calculated as a moving average ± multiplier × standard deviation. This handout is designed to explain the STATA readout you get when doing regression. Applications of regression analysis exist in almost every field. Regression is a statistical measurement that attempts to determine the strength of the relationship between one dependent variable (usually denoted by Y) and a series of other changing variables. We apply the lm function to a formula that describes the variable eruptions by the variable waiting, and save the linear regression model in a new variable. MULTIPLE REGRESSION USING THE DATA ANALYSIS ADD-IN. Re: Excel formula to regression-fit a cubic polynomial: for whatever reason, charts and LINEST do polynomial regression differently; the chart gives an R² of 0.9843, while LINEST gives a different value. If the model fits well, the observed Ys will cluster closely around the regression line. For linear regression, x is the predictor variable for the response variable y; to make a model, we can use the scipy linregress method. Is the equation supported by sound theory? Find the standard deviation of the eruption duration in the data set faithful. In this post you will discover the linear regression algorithm, how it works, and how you can best use it in your machine learning projects. Here is the spreadsheet with this data, in case you wish to see how this graph was built. To implement simple linear regression we need to know the formulas below.
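A minimal sketch of the scipy linregress method mentioned above, with made-up data points:

```python
from scipy.stats import linregress

x = [1, 2, 3, 4, 5]
y = [2.1, 4.2, 5.9, 8.1, 9.9]

# linregress returns slope, intercept, r-value, p-value, and standard error.
result = linregress(x, y)
print(round(result.slope, 2), round(result.intercept, 2))  # -> 1.95 0.19
```

The returned slope and intercept define the fitted line ŷ = intercept + slope·x, which can then be used for the predictions discussed throughout this document.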
MULTIPLE REGRESSION (note: CCA is a special kind of multiple regression). The below represents a simple, bivariate linear regression on a hypothetical data set. If we had no knowledge about the regression slope, our best prediction of y would simply be the mean of y. It is the random variance that causes some of the samples to have extreme values. This simple linear regression calculator uses the least squares method to find the line of best fit for a set of paired data, allowing you to estimate the value of a dependent variable (Y) from a given independent variable (X). It is the value listed with the explanatory variable. This is due to random measurement error or, put another way, non-systematic fluctuations around the true mean. Review of Multiple Regression. The ANOVA table: sums of squares, degrees of freedom, mean squares, and F. Consider the following four cases. The F value is obtained by dividing the Mean Square for regression by the Mean Square for the residual.
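The F calculation just described, the Mean Square for regression divided by the Mean Square for the residual, can be sketched with hypothetical sums of squares and degrees of freedom:

```python
# Hypothetical ANOVA-table quantities for a model with k predictors
# fitted to n observations (illustrative numbers only).
ss_reg, ss_res = 90.0, 30.0
k, n = 2, 33

ms_reg = ss_reg / k            # Mean Square for regression
ms_res = ss_res / (n - k - 1)  # Mean Square for residual (error)
f_stat = ms_reg / ms_res       # F with (k, n - k - 1) degrees of freedom
print(f_stat)  # -> 45.0
```

Under the null hypothesis that all slopes are zero, this ratio follows an F distribution with k and (n − k − 1) degrees of freedom, as stated earlier.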
Linear Regression Calculator: this linear regression calculator can help you find the intercept and the slope of a linear regression equation and draw the line of best fit from a set of data with a scalar dependent variable (y) and an explanatory one (x). It is assumed that the binary response, Y, takes on the values of 0 and 1, with 0 representing failure and 1 representing success. Regression requires that we have an explanatory and a response variable. The first step is to be clear on what your goal is. Regression equation: a statistical technique used to explain or predict the behavior of a dependent variable.