Showing results for tags 'python'.
Sorry, I'm assuming that the skill set needed to become an RA in STR and OT is the same... I've done some research, but the answers I've gotten about which programs and programming languages matter vary a lot:

- Stata: the standard tool used in STR/OT "in the past"
- R: quickly catching up and replacing Stata
- SQL: mostly not needed
- Excel: very useful
- Python: not strictly required, but very useful when there's no data set available (someone told me that very often profs just happen to have a research idea, and as an RA I'd be expected to find the relevant data)

The thing is, I really don't have the time to learn all of these skills, so I have to make trade-offs... but how? One PhD student told me to learn Stata and Python, but another told me to learn R and Python, since Stata is being used much less frequently and R is a more useful program to learn. Completely overwhelmed... Besides, what level of econometrics knowledge is needed to do RA work? Would it be fine if I don't know any at all? Or is a beginner-level understanding required? Perhaps even an intermediate one? I was wondering if some of you would be so kind as to help me with these doubts... Thx!
Hi, I am a third-year economics and finance student. Lately I decided to take an introductory course in programming, as I thought it would be useful. On top of that, I am taking a course in mathematical economics that will give me skills in using the MAPLE software, and I am already using STATA for my econometrics classes. My question is: will Python be useful for doing economic analysis? I am interested in finding ways to use some programming for my final-year research thesis and in a working environment. Thanks.
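For a rough sense of what that can look like, here is a minimal sketch of running an OLS regression in Python with pandas and statsmodels, which covers much of what a basic `regress` command does in Stata. The CSV file and the variable names are placeholders, not from any particular course or data set:

# Minimal OLS example with pandas + statsmodels.
# "wages.csv" and the column names below are hypothetical placeholders.
import pandas as pd
import statsmodels.api as sm

df = pd.read_csv("wages.csv")                         # load a structured data set
X = sm.add_constant(df[["education", "experience"]])  # regressors plus an intercept
model = sm.OLS(df["log_wage"], X).fit()               # same idea as Stata's `regress`
print(model.summary())                                # coefficients, std errors, R-squared

From there, the same script can branch into plotting, robustness checks, or whatever a thesis needs, which is part of why Python keeps coming up alongside Stata.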
Since it's often not very clear how an economics researcher would actually go about using Python in their research, I put together a little tutorial that demonstrates a couple of combined use cases. The title of the tutorial is "Estimate an Econometric Model from Scraped Data in Real-Time using Python" and you can find it here: live.economics.io

The application is determining the factors that influence the price of a desktop CPU, using the prices and features of the processors currently listed at newegg.com as the data set for the model. One point of the tutorial is to illustrate how straightforward it can be to collect, clean, and analyze data from the Internet, that is, to turn unstructured data into structured data and then perform some analysis on it, all using Python.

Another idea I wanted to demonstrate is thinking of the data you want to analyze in your research as just some txt/xls/csv/etc. file on your computer. The ability to literally build or engineer your own data set has many great properties. One is that your data set will likely be unique, since it is a function of the collection algorithms you write in your scripts (in terms of discovering the agents/firms/products/features that are relevant to analyzing a particular market) and, when operating at a larger scale, of the configuration of the server (and its geographic location!) that hosts the websites you're scraping. As always, comments and feedback are encouraged.
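To give a flavor of the scrape-then-estimate workflow the tutorial walks through, here is a minimal sketch using requests, BeautifulSoup, pandas, and statsmodels. The URL, the CSS selectors, and the feature names are hypothetical placeholders, not the tutorial's actual code; a real scraper would need the target site's actual markup (and should respect its terms of service):

# Sketch: scrape listings, build a structured data set, fit a simple
# hedonic price model. The URL and selectors below are hypothetical --
# substitute the real page structure of whatever site you target.
import requests
import pandas as pd
import statsmodels.api as sm
from bs4 import BeautifulSoup

html = requests.get("https://example.com/cpus?page=1").text
soup = BeautifulSoup(html, "html.parser")

rows = []
for item in soup.select("div.listing"):        # one block per CPU listing
    price_text = item.select_one(".price").text.strip()
    price = float(price_text.lstrip("$").replace(",", ""))
    cores = int(item.select_one(".cores").text.strip())
    ghz = float(item.select_one(".clock").text.strip().rstrip("GHz"))
    rows.append({"price": price, "cores": cores, "ghz": ghz})

df = pd.DataFrame(rows)                        # unstructured -> structured

# Which features drive price? A basic linear model on the scraped data:
X = sm.add_constant(df[["cores", "ghz"]])
print(sm.OLS(df["price"], X).fit().summary())

The whole pipeline, from raw HTML to a coefficient table, fits in one short script, which is the "straightforward" part I was trying to get across.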