Aptech Blog » Econometrics
by Eric
3M ago
Introduction In this video, you'll learn the basics of panel data analysis in GAUSS. We demonstrate panel data modeling from start to finish, from loading data to running a group-specific intercept model. This video is available, along with all GAUSS videos, on our GAUSS YouTube Channel. Be sure to explore all our GAUSS videos and subscribe to the channel to get the latest videos as they are released. Summary and Timeline You'll see firsthand how to: Load and verify panel data. Merge data from different sources. Convert between wide and long form panel data. Explore and clean data. Create panel d ..read more
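The video works in GAUSS; as an illustration of the underlying idea only, here is a minimal pure-Python sketch of a group-specific intercept (fixed effects) estimator, computed by demeaning within each group and then running single-regressor OLS. The data and function name are hypothetical, not taken from the video.

```python
from collections import defaultdict

def within_estimator(y, x, groups):
    """Slope estimate for a group-specific intercept (fixed effects) model:
    demean y and x within each group, then run single-regressor OLS."""
    sums = defaultdict(lambda: [0.0, 0.0, 0])  # per-group: sum(y), sum(x), count
    for yi, xi, g in zip(y, x, groups):
        s = sums[g]
        s[0] += yi
        s[1] += xi
        s[2] += 1
    num = den = 0.0
    for yi, xi, g in zip(y, x, groups):
        sy, sx, n = sums[g]
        yd, xd = yi - sy / n, xi - sx / n  # within-group demeaning
        num += xd * yd
        den += xd * xd
    return num / den

# Toy panel: two groups with different intercepts but a common slope of 2.
groups = [0, 0, 0, 1, 1, 1]
x = [1.0, 2.0, 3.0, 1.0, 2.0, 3.0]
y = [2.0 * xi + (10.0 if g == 0 else -5.0) for xi, g in zip(x, groups)]
print(within_estimator(y, x, groups))  # 2.0 -- the group intercepts drop out
```

Because the demeaning removes each group's intercept, the slope is recovered exactly despite the very different group levels.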
by Eric
4M ago
Introduction In this video, you'll learn the basics of choice data analysis in GAUSS. Our video demonstration shows just how quick and easy it is to get started with everything from data loading to discrete data modeling. Summary and Timeline You'll see firsthand how to: Load and verify survey data. Compute descriptive statistics. Merge data from different sources. Create basic scatter and frequency plots. Fit a basic probit model. Timeline 0:52 Load and verify CSV survey data. 2:53 Change the base case of a categorical variable. 5:24 Merge dataframes. 6:40 Descriptive statistics. 9:25 XY ..read more
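The video loads and summarizes survey data in GAUSS; the sketch below shows the same two early steps (loading CSV survey data, then a frequency table and a descriptive statistic) in standard-library Python. The survey extract is hypothetical, invented for the example.

```python
import csv
import io
from collections import Counter
from statistics import mean

# Hypothetical survey extract; in the video the data come from a CSV file on disk.
raw = io.StringIO(
    "id,choice,age\n"
    "1,bus,23\n"
    "2,car,35\n"
    "3,car,41\n"
    "4,bike,29\n"
    "5,car,52\n"
)
rows = list(csv.DictReader(raw))

freq = Counter(r["choice"] for r in rows)      # frequency table of the choice variable
avg_age = mean(int(r["age"]) for r in rows)    # a simple descriptive statistic

print(freq.most_common())  # [('car', 3), ('bus', 1), ('bike', 1)]
print(avg_age)             # 36
```

A fitted probit model would then relate `choice` to covariates like `age`; the video does that step with GAUSS's built-in estimation tools.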
by Eric
7M ago
Introduction Survey data is a powerful analysis tool, providing a window into people's thoughts, behaviors, and experiences. By collecting responses from a diverse sample of respondents on a range of topics, surveys offer invaluable insights. These can help researchers, businesses, and policymakers make informed decisions and understand diverse perspectives. In today's blog we'll look more closely at survey data including: Fundamental characteristics of survey data. Data cleaning considerations. Data exploration using frequency tables and data visualizations. Managing survey data in GAUSS. Wh ..read more
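The post explores survey data with frequency tables after cleaning; a minimal Python sketch of that workflow, with hypothetical Likert-style responses and invented missing-value codes, looks like this:

```python
from collections import Counter

# Hypothetical Likert-style responses; '' and 'NA' stand in for missing answers.
responses = ["agree", "disagree", "", "agree", "NA", "neutral", "agree", "disagree"]

# Data cleaning: drop missing responses before tabulating.
MISSING = {"", "NA", "N/A", None}
clean = [r for r in responses if r not in MISSING]

# Data exploration: a frequency table with relative frequencies.
freq = Counter(clean)
total = len(clean)
for level, count in freq.most_common():
    print(f"{level:10s} {count}  ({count / total:.1%})")
```

How missing responses are coded varies by survey, so the `MISSING` set here is illustrative; in practice the cleaning rules come from the survey's codebook.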
by Eric
8M ago
Introduction Anyone who works with panel data knows that pivoting between long and wide form, though commonly necessary, can still be painstakingly tedious, at best. It can lead to frustrating errors, unexpected results, and lengthy troubleshooting, at worst. The new dfLonger and dfWider procedures introduced in GAUSS 24 make great strides towards fixing that. Extensive planning has gone into each procedure, resulting in comprehensive but intuitive functions. In today's blog, we will walk through all you need to know about the dfLonger procedure to tackle even the most complex cases of transfo ..read more
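dfLonger is a GAUSS 24 procedure, and the excerpt does not show its signature, so the sketch below is only a loose pure-Python analogue of the wide-to-long pivot it performs, not the actual GAUSS API. Column and parameter names are hypothetical.

```python
def df_longer(rows, id_col, value_cols, names_to="variable", values_to="value"):
    """Pivot wide-form records (one row per unit, one column per period)
    into long form (one row per unit-period)."""
    long_rows = []
    for row in rows:
        for col in value_cols:
            long_rows.append({id_col: row[id_col], names_to: col, values_to: row[col]})
    return long_rows

# Hypothetical wide-form panel: one row per country, one column per year.
wide = [
    {"country": "A", "y1990": 1.2, "y2000": 1.5},
    {"country": "B", "y1990": 0.8, "y2000": 1.1},
]
long_form = df_longer(wide, "country", ["y1990", "y2000"])
print(long_form[0])   # {'country': 'A', 'variable': 'y1990', 'value': 1.2}
print(len(long_form)) # 4 rows: 2 countries x 2 periods
```

The hard cases the post tackles (splitting compound column names into several identifying variables, handling mixed types) are exactly where a hand-rolled loop like this breaks down and a dedicated procedure such as dfLonger earns its keep.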
by Eric
1y ago
Introduction The new GAUSS Machine Learning (GML) library offers powerful and efficient machine learning techniques in an accessible and friendly environment. Whether you're just getting started with machine learning or an experienced practitioner, you'll be running models in no time with GML. Machine Learning Models at Your Fingertips With the GAUSS Machine Learning library, you can run machine learning models out of the box, even without any machine learning background. It supports fundamental machine learning models for classification and regression including: Logistic regression. LASSO and ..read more
by Eric
1y ago
Introduction Logistic regression has long been a popular tool for modeling categorical outcomes. It's widely used across fields like epidemiology, finance, and econometrics. In today's blog we'll look at the fundamentals of logistic regression. We'll use a real-world survey data application and provide a step-by-step guide to implementing your own regularized logistic regression models using the GAUSS Machine Learning library, including: Data preparation. Model fitting. Classification predictions. Evaluating predictions and model fit. What is Logistic Regression? Logistic regression ..read more
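The post fits regularized logistic regression with the GAUSS Machine Learning library; purely to illustrate the mechanics, here is a from-scratch Python sketch of a single-feature logistic regression fit by gradient descent, with an optional ridge-style penalty on the weight. The data and learning-rate settings are invented for the example.

```python
import math

def fit_logistic(xs, ys, lr=0.5, epochs=5000, l2=0.0):
    """Single-feature logistic regression fit by batch gradient descent.
    l2 > 0 adds ridge-style regularization to the weight (not the intercept)."""
    w, b = 0.0, 0.0
    n = len(xs)
    for _ in range(epochs):
        gw = gb = 0.0
        for x, y in zip(xs, ys):
            p = 1.0 / (1.0 + math.exp(-(w * x + b)))  # predicted probability
            gw += (p - y) * x
            gb += (p - y)
        w -= lr * (gw / n + l2 * w)
        b -= lr * (gb / n)
    return w, b

def predict(w, b, x):
    """Classify as 1 when the predicted probability reaches 0.5."""
    return 1 if 1.0 / (1.0 + math.exp(-(w * x + b))) >= 0.5 else 0

# Hypothetical data: outcomes switch from 0 to 1 as x grows.
xs = [0.5, 1.0, 1.5, 3.0, 3.5, 4.0]
ys = [0, 0, 0, 1, 1, 1]
w, b = fit_logistic(xs, ys)
print([predict(w, b, x) for x in xs])  # [0, 0, 0, 1, 1, 1]
```

In practice a library routine adds the pieces omitted here: multiple features, convergence checks, and cross-validated choice of the regularization strength.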
by Eric
1y ago
Introduction If you've ever done empirical work, you know that real-world data rarely, if ever, arrives clean and ready for modeling. No data analysis project consists solely of fitting a model and making predictions. In today's blog, we walk through a machine learning project from start to finish. We'll give you a foundation for completing your own machine learning project in GAUSS, working through: Data exploration and cleaning. Splitting data for training and testing. Model fitting and prediction. Background Our Data Today we will be working with the California Housing Dataset from Kaggle ..read more
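Of the steps listed, splitting data for training and testing is the easiest to show compactly. The post does this in GAUSS; as a generic sketch, a seeded shuffle-and-slice in Python looks like this (function name and split fraction are illustrative):

```python
import random

def train_test_split(rows, test_frac=0.2, seed=42):
    """Shuffle the data with a fixed seed, then hold out test_frac for testing.
    The fixed seed makes the split reproducible across runs."""
    rng = random.Random(seed)
    shuffled = rows[:]          # copy so the caller's data stays in order
    rng.shuffle(shuffled)
    n_test = int(len(shuffled) * test_frac)
    return shuffled[n_test:], shuffled[:n_test]  # (train, test)

data = list(range(100))
train, test = train_test_split(data)
print(len(train), len(test))  # 80 20
```

Keeping the test rows untouched until the final evaluation is what makes the reported prediction error an honest estimate of out-of-sample performance.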
by Eric
1y ago
Introduction Principal components analysis (PCA) is a useful tool that can help practitioners streamline data without losing information. In today’s blog, we’ll examine the use of principal components analysis in finance using an empirical example. Specifically, we’ll look more closely at: What PCA is. How PCA works. How to use the GAUSS Machine Learning library to perform PCA. How to interpret PCA results. What is Principal Components Analysis? Principal components analysis (PCA) is an unsupervised learning method that results in a low-dimensional representation of a dataset. The intuition ..read more
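The post performs PCA with the GAUSS Machine Learning library; to make the mechanics concrete, here is a NumPy sketch of the textbook recipe: center the data, eigendecompose the covariance matrix, and read off components, scores, and explained-variance ratios. The three-column dataset is synthetic, built so one direction dominates.

```python
import numpy as np

def pca(X, k):
    """PCA via eigendecomposition of the covariance matrix of centered data.
    Returns the top-k components, the scores, and explained-variance ratios."""
    Xc = X - X.mean(axis=0)
    cov = np.cov(Xc, rowvar=False)
    eigvals, eigvecs = np.linalg.eigh(cov)  # eigh returns ascending order
    order = np.argsort(eigvals)[::-1]       # re-sort descending by variance
    eigvals, eigvecs = eigvals[order], eigvecs[:, order]
    components = eigvecs[:, :k]
    scores = Xc @ components                # low-dimensional representation
    explained = eigvals / eigvals.sum()
    return components, scores, explained

# Synthetic data: three observed series driven by one latent factor t.
rng = np.random.default_rng(0)
t = rng.normal(size=200)
X = np.column_stack([
    t,
    2 * t + 0.05 * rng.normal(size=200),
    -t + 0.05 * rng.normal(size=200),
])
_, scores, explained = pca(X, 1)
print(scores.shape)  # (200, 1): one component summarizes three series
```

Because the three columns are all near-multiples of the same factor, the first explained-variance ratio is close to one, which is exactly the situation (e.g., co-moving yields or returns in finance) where PCA pays off.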
by Eric
1y ago
Introduction Forecasts have become a valuable commodity in today's data-driven world. Unfortunately, not all forecasting models are of equal caliber, and incorrect predictions can lead to costly decisions. Today we will compare the performance of several prediction models used to predict recessions. In particular, we’ll look at how a traditional baseline econometric model compares to machine learning models. Our models will include: A baseline probit model. K-nearest neighbors. Decision forests. Ridge classification. The aim of today’s blog isn’t to provide a definitive answer on what model ..read more
by Eric
1y ago
Introduction In today's blog, we examine a very useful data analytics tool: kernel density estimation (KDE), which is used to estimate the probability density of a data sample. We look into the foundation of KDE and demonstrate how to use it with a simple application. What is Kernel Density Estimation? Kernel density estimation is a nonparametric method for estimating probability distributions. Before diving too deeply into kernel density estimation, it is helpful to understand the concept of nonparametric estimation. Nonparametric Estimation Unlike tradi ..read more
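The KDE idea is simple enough to write out directly: the density estimate at a point is the average of kernel functions centered at each observation, scaled by a bandwidth. A minimal Python sketch with a Gaussian kernel, using an invented sample and an arbitrary bandwidth choice:

```python
import math

def gaussian_kde(sample, x, bandwidth):
    """Kernel density estimate at x: the average of Gaussian kernels
    centered at each observation, each scaled by the bandwidth h."""
    h = bandwidth
    return sum(
        math.exp(-0.5 * ((x - xi) / h) ** 2) / (h * math.sqrt(2 * math.pi))
        for xi in sample
    ) / len(sample)

# Hypothetical sample; the bandwidth 0.5 is an arbitrary illustrative choice.
sample = [-1.2, -0.4, 0.0, 0.3, 0.9, 1.5]

# Sanity check: a Riemann sum of the estimate over a wide grid should be ~1,
# since a KDE is itself a proper probability density.
grid = [i * 0.01 for i in range(-800, 801)]
area = sum(gaussian_kde(sample, x, bandwidth=0.5) * 0.01 for x in grid)
print(round(area, 3))  # 1.0
```

In real applications the bandwidth is the critical tuning choice (too small gives a spiky estimate, too large oversmooths), and library implementations select it with data-driven rules rather than fixing it by hand as done here.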
