This content originally appeared on DEV Community and was authored by RamyaKaruturi
🛠️ Foundation Phase Recap (Aug 11 – Sep 28)
I started my foundation phase on August 11 with the goal of building comfort in scripting, querying, and managing environments. Over the past seven weeks, I've explored Linux, cloud fundamentals, and SQL, and I've built my first ETL pipeline. Each week had its own focus, and I've documented everything on Hashnode and GitHub.
🔹 Week 1: Linux Basics and Cloud Introduction
This week was about getting familiar with Linux commands and understanding cloud architecture. I practiced workflows like navigation, file management, permissions, and package handling. On the cloud side, I explored service models and AWS basics.
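To give a flavor of that practice, here's a minimal sketch of the kinds of commands involved, assuming a Debian/Ubuntu machine (file and directory names are just placeholders):

```bash
# Navigation and inspection
cd /var/log && ls -lh              # list files with human-readable sizes
find . -name "*.log" -mtime -7     # log files modified in the last 7 days

# File management and permissions
mkdir -p ~/practice/scripts        # -p creates parent directories as needed
cp notes.txt ~/practice/           # copy a file
chmod 750 ~/practice/scripts       # rwx for owner, r-x for group, none for others
chown "$USER":"$USER" notes.txt    # set file owner and group

# Package handling (apt on Debian/Ubuntu)
sudo apt update                    # refresh the package index
sudo apt install -y tree           # install a package non-interactively
```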
📘 Hashnode Article: Week 1 & Week 2 – Linux and Cloud Fundamentals
📂 GitHub Documentation: Week 1 – Reflections
🔹 Week 2: Continued Linux Practice and Cloud Concepts
I refined my Linux skills with process control, task scheduling, and system resource monitoring. I also explored IAM roles and EC2 setup in AWS.
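Roughly, that practice looked like the commands below (the process name, PID, and cron entry are placeholders, and the AWS CLI calls assume configured credentials):

```bash
# Process control
ps aux | grep nginx       # find a running process (nginx is just an example)
kill -15 1234             # send SIGTERM to politely stop PID 1234

# Scheduling: run a script every day at 2 AM
crontab -l                                            # list current cron jobs
echo "0 2 * * * /home/user/backup.sh" | crontab -     # careful: replaces the whole crontab

# Monitoring system resources
top        # live CPU and memory view
df -h      # disk usage per filesystem
free -m    # memory usage in megabytes

# AWS side (requires the AWS CLI with credentials set up)
aws iam list-roles --query "Roles[].RoleName"
aws ec2 describe-instances --query "Reservations[].Instances[].InstanceId"
```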
📘 Hashnode Article: Week 1 & Week 2 – Linux and Cloud Fundamentals
📂 GitHub Documentation: Week 2 – Reflections
🔹 Week 3: PostgreSQL Practice and Query Mastery
This week was focused on SQL. I designed a mini sales database and solved 50+ queries across filtering, joins, aggregations, and window functions. It helped me understand relational logic and query design.
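As an illustration, here are two queries in the spirit of that practice, run through psql against a hypothetical salesdb with an orders table (the database, table, and column names are assumptions, not my exact schema):

```bash
# Filtering + aggregation: big spenders since the start of the year
psql -d salesdb -c "
  SELECT customer_id, SUM(amount) AS total_spent
  FROM orders
  WHERE order_date >= '2025-01-01'
  GROUP BY customer_id
  HAVING SUM(amount) > 1000
  ORDER BY total_spent DESC;"

# Window function: rank each order within its customer's history
psql -d salesdb -c "
  SELECT customer_id, order_id, amount,
         RANK() OVER (PARTITION BY customer_id ORDER BY amount DESC) AS amount_rank
  FROM orders;"
```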
📘 Hashnode Article: Week 3 – PostgreSQL Practice & Query Mastery
📂 GitHub Documentation: Week 3 – Reflections
⏸️ Sep 1 – Sep 6: Break Week
I didn't work on foundation topics this week. I was involved in other tasks not related to this phase, so I didn't count this as part of the learning timeline.
🔹 Week 4: ETL Pipeline Project
This week was all about building. I created a beginner-friendly ETL pipeline using Linux shell scripting, Python (pandas), and PostgreSQL. I extracted, transformed, and loaded drug label data into a structured database.
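The real code lives in the repository linked below, but as a rough sketch the orchestration has this shape (the URL, transform.py, and the database and table names are placeholders, not the actual project files):

```bash
#!/usr/bin/env bash
set -euo pipefail   # stop the pipeline on the first failed step

# 1. Extract: download the raw data (placeholder URL)
curl -sSf "https://example.com/drug_labels.csv" -o raw_labels.csv

# 2. Transform: clean it with a pandas script (transform.py is hypothetical)
python3 transform.py raw_labels.csv clean_labels.csv

# 3. Load: bulk-copy the cleaned CSV into PostgreSQL
psql -d etl_db -c "\copy drug_labels FROM 'clean_labels.csv' WITH (FORMAT csv, HEADER true)"

echo "ETL run completed at $(date)"
```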
📘 Hashnode Article: My First ETL Pipeline
📂 GitHub Repository: Linux_ETL_Pipeline
🔹 Week 5: Thinking Like a Builder
This week was a mindset shift. I stopped just running commands and started designing systems.
- Practiced shell scripting with error handling (see the sketch after this list)
- Explored Docker basics
- Worked with EC2, Lambda, and S3 lifecycle rules
- Documented daily reflections with clarity
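Here's the kind of defensive scripting pattern I mean; a minimal sketch (the backup task itself is just an example):

```bash
#!/usr/bin/env bash
set -euo pipefail   # exit on errors, unset variables, and pipeline failures

# Report where the script failed instead of dying silently
trap 'echo "Error on line $LINENO" >&2' ERR

# Fail fast with a usage message if no argument is given
backup_dir="${1:?Usage: $0 <backup_dir>}"

# Validate input before doing any work
if [[ ! -d "$backup_dir" ]]; then
  echo "Directory not found: $backup_dir" >&2
  exit 1
fi

tar -czf "backup_$(date +%F).tar.gz" "$backup_dir"
echo "Backup written successfully"
```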
📘 Hashnode Article: Week 5 – The Week I Started Thinking Like a Builder
📂 GitHub Documentation: Week 5 – Reflections
🧊 Week 6: Internship Work – Smart Fridge Annotation
After Week 5, I planned to start Phase 2, but as part of my internship I was assigned to work on data collection and annotation for smart fridge images.
This week, I focused on:
- Collecting diverse fridge images
- Annotating items shelf by shelf
🚀 Starting Phase 2: Core Workflows
Now that the foundation phase is complete, I'm officially starting Phase 2 of my journey.
This phase will focus on how data moves, transforms, and gets scheduled.
🧠 Phase 2 Plan (Sep 29 – Early Dec)
- Data Extraction: APIs, web scraping, Selenium (see the sketch after this list)
- Data Ingestion: Kafka, Spark, Flink
- Orchestration: Airflow, dbt
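As a small taste of the extraction work ahead, here's a hedged sketch of pulling records from a JSON API with curl and jq (the URL and field names are placeholders; jq must be installed):

```bash
# Fetch one page of results and flatten the fields we care about to CSV
curl -sSf "https://api.example.com/v1/products?page=1" \
  | jq -r '.items[] | [.id, .name, .price] | @csv' \
  > products_page1.csv

head -n 5 products_page1.csv   # sanity-check the extracted rows
```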
I'll continue documenting each week with clarity, sharing both technical progress and mindset shifts.
"Foundation gave me clarity. Now I'm building momentum."