1. Complete four portfolio projects
Employers want to see your statistics and code. You’ll craft four new projects illustrating data loading, analysis, and reporting. Cloned or demo projects don't cut it – these original works will stand out.
2. Publish a professional portfolio
This is your place to shine on the Internet. Your portfolio should contain your contact information, links to your GitHub and LinkedIn profiles, and most importantly, your work.
3. Get a recommendation from your mentor
Your relationship with your mentor evolves during the course. In the beginning, there will be much more handholding, but soon enough your sessions will evolve into code reviews and discussion around best practices.
1. Identify new opportunities.
When you meet with your career coach, you’ll build a spreadsheet to identify new job opportunities and networking efforts.
2. Craft your applications.
Next, you’ll research the company, send drafts of your resume and cover letter to your career coach, and start submitting applications.
3. Prepare for interviews.
After your first month, you’ll get ready with mock interviews, modeling samples, and coding challenges.
4. Manage your pipeline.
Getting a job takes effort; you’ll need to focus on tracking application status and staying persistent in the job hunt.
A note from the Head of Education
This is our first new program since the top-rated Web Development Bootcamp. Seeing you succeed is why I work at Thinkful. With our network of 350 mentors, 26 staff at HQ, and 7,000 students, you'll have an unprecedented amount of support and opportunity – I personally guarantee it.
— Grae Drake, Head of Education
Thinkful does a great job of keeping you motivated, customizing the program to fit your needs and goals, and supporting you every step of the way to make sure that you are getting everything and more out of the program. After I completed the program, it was less than a month before I received three separate requests for interviews. Everyone I've ever talked to has loved the variety and skillset of the portfolio that I developed with Thinkful.
Jason Walkow, Frontend Developer, LAKANA
My data science mentor was the best I have ever met. After the course, I felt so ready to become an outstanding web analyst with data science knowledge.
Jingru Wu, Data Analyst, Nintendo
Developer, Wycliffy Associates
Full Stack Developer, Goodybag
CSS Wizard, Mumba
Full Stack Developer, Hypeist
Frontend Developer, mywedding.com
Director of Academic Technology at The Perkiomen School
In data science, one size does not fit all. That's why Thinkful's 1-on-1 mentorship is so critical: learn the topics you need to get a job, confirm what you already know, and focus on the tools the industry demands. Classrooms are one-size-fits-all, but that doesn't work when there's no single path to success. Three times a week, your mentor will help you navigate difficult concepts, teach you best practices, and provide real-time code reviews.
Employers want to see real work, not exercises and test results. That's why the curriculum guides you through the concepts and then straight into building unique projects. Learning happens when you connect new concepts with hands-on experience and feedback.
Get interview prep and practice coding challenges from your first month to start the path towards job placement. Before you know it, you’ll be polishing your resume and signing your cover (and offer!) letters.
Join a community of students, mentors, and alumni to get your questions answered in real time, share your successes, vent your frustrations, and meet people who share the same goals.
Workshops and Q&A sessions
Every student has access to 40+ hours of group mentorship every week. Q&A sessions allow students to ask any question and get help right away, while workshops encourage mentors to highlight a specific topic in a project.
The Python data science toolkit
You’ll learn cutting edge Python tools to collect, clean, explore, and analyze data. Work with popular open source tools like Pandas to analyze data frames, Beautiful Soup to scrape and collect novel data sets, and matplotlib to explore data before building sophisticated models with scikit-learn.
Topics: Python, Pandas, NumPy, matplotlib, scikit-learn, Beautiful Soup
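To give a flavor of this workflow, here is a minimal sketch of loading and aggregating data with Pandas (the city and sales figures are invented for illustration; in practice you'd load a real file with pd.read_csv):

```python
import pandas as pd

# A tiny in-memory data frame standing in for a loaded CSV
df = pd.DataFrame({
    "city": ["NYC", "LA", "NYC", "Chicago"],
    "sales": [250, 180, 310, 95],
})

# Group and aggregate: total sales per city
totals = df.groupby("city")["sales"].sum()
print(totals["NYC"])  # 560
```

The same groupby/aggregate pattern scales from toy frames like this one to millions of rows.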
Statistics, probability, linear algebra
Understand the mathematics behind your tools. You’ll learn to choose the right for the job (and why), see the limits of your models, and know how to improve your decision making by understanding your toolkit from the ground up.
Topics: Standard Deviation, Correlation, Descriptive Statistics, Linear and Logistic Regression, Probability Distributions, Significance, Vectors, Matrices
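As a small taste of these fundamentals, here is a sketch computing descriptive statistics and a Pearson correlation with NumPy (the height and weight values are made up, and chosen to be perfectly linearly related so the correlation comes out to 1.0):

```python
import numpy as np

heights = np.array([150, 160, 170, 180, 190], dtype=float)
weights = np.array([50, 58, 66, 74, 82], dtype=float)

# Descriptive statistics
mean = heights.mean()        # 170.0
std = heights.std(ddof=1)    # sample standard deviation

# Pearson correlation between the two variables
r = np.corrcoef(heights, weights)[0, 1]
print(round(r, 3))  # 1.0: weights are an exact linear function of heights
```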
Data wrangling and exploratory data analysis
Understand your data *before* diving into the math. Professional data scientists know the value of asking the right question and seeing constraints before diving in too deep. You’ll use Pandas, Juypter notebooks, and matplotlib to quickly explore, visualize, and clean your data.
Topics: SQL, Postgres, SQLite, Jupyter Notebooks, Visualization, Data Cleaning, Pandas, matplotlib, Data Exploration
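Typical cleaning steps look something like this sketch, which drops a duplicate row and imputes a missing value with Pandas (the names and ages are invented for the example):

```python
import numpy as np
import pandas as pd

# A messy frame with a duplicate row and a missing value
raw = pd.DataFrame({
    "name": ["Ada", "Ada", "Grace", "Alan"],
    "age": [36, 36, np.nan, 41],
})

clean = (
    raw.drop_duplicates()  # remove the repeated "Ada" row
       .assign(age=lambda d: d["age"].fillna(d["age"].median()))  # impute missing age with the median
       .reset_index(drop=True)
)
print(len(clean))  # 3 rows remain
```

Inside a Jupyter notebook you'd typically follow each step like this with a quick plot or `.describe()` call to sanity-check the result.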
Harness machine learning
Learn powerful regression and classification methods to model the world and predict the future. Study and use methods like linear and logistic regression, k-means clustering, gradient descent, and principle component analysis to model real-world data.
Topics: Regression, classification, scikit-learn, cross-validation, k-fold, clustering, k-means, gradient descent, PCA
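A minimal sketch of this workflow in scikit-learn: fit a logistic regression classifier on the bundled iris data set and estimate its out-of-sample accuracy with 5-fold cross-validation.

```python
from sklearn.datasets import load_iris
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

X, y = load_iris(return_X_y=True)

model = LogisticRegression(max_iter=1000)
# 5-fold cross-validation: train on 4 folds, score on the held-out fold, 5 times
scores = cross_val_score(model, X, y, cv=5)
print(scores.mean())
```

Swapping in k-means clustering or a regression model is a one-line change thanks to scikit-learn's uniform estimator API.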
Telling stories with data
Finding solutions is great, but only if you can communicate those solutions back to your team. Your ability to tell stories with data is just as important as your programming and mathematics skills. You’ll learn strategies from top practicing data scientists to understand, visualize, and finally communicate the data you process.
Topics: Data visualization, plotting, seaborn, matplotlib, storytelling, communication
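The simplest version of a data story is a labeled trend line. Here is a matplotlib sketch (the revenue numbers are invented; the Agg backend renders off-screen so it runs anywhere):

```python
import matplotlib
matplotlib.use("Agg")  # render off-screen; no display needed
import matplotlib.pyplot as plt

months = ["Jan", "Feb", "Mar", "Apr"]
revenue = [10, 14, 13, 19]  # made-up figures for the sketch

fig, ax = plt.subplots()
ax.plot(months, revenue, marker="o")
ax.set_title("Quarterly revenue trend")
ax.set_xlabel("Month")
ax.set_ylabel("Revenue ($k)")
fig.savefig("trend.png")
```

A clear title and labeled axes do much of the storytelling work before anyone reads a single number.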
Big data and distributed computing
Modern data sets are large and growing larger. Learn to process data at scale using distributed tools like Hadoop, Hive, Pig, MapReduce and Spark. Deploy your own servers on AWS and tackle enormous data sets.
Topics: Hadoop, Hive, Pig, Spark, MapReduce, distributed computing, AWS, big data
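The core idea behind Hadoop and Spark is the map/shuffle/reduce pattern. Real deployments distribute these phases across a cluster; this pure-Python word-count sketch only illustrates the pattern on two toy documents:

```python
from collections import defaultdict

docs = ["big data big ideas", "data at scale"]

# Map: emit a (word, 1) pair for every word in every document
mapped = [(word, 1) for doc in docs for word in doc.split()]

# Shuffle: group the pairs by key (word)
groups = defaultdict(list)
for word, count in mapped:
    groups[word].append(count)

# Reduce: sum the counts for each word
word_counts = {word: sum(counts) for word, counts in groups.items()}
print(word_counts["data"])  # 2
```

In a real cluster, the map and reduce steps run in parallel on different machines, and the shuffle moves pairs over the network so each reducer sees all counts for its keys.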