job summary: Looking for a Sr. Data Engineer - 5x onsite in Glendale/Burbank, CA. Must have 3+ years of experience in a Data Engineering role, with Python, and with designing, building, and maintaining ETL/ELT data pipelines; knows SQL query optimization. BONUS: Snowflake, Gitlab, Airflow, AWS S3, relational databases, Tableau ++. 2+ years with cloud-based technologies (AWS and S3) using Apache Airflow and Databricks. Experience in streaming or media is a BONUS.
[
"
The Disney Decision Science and Integration (DDSI) analytics consulting team is responsible for supporting clients across The Walt Disney Company including Disney Parks, Experiences and Products (e.g., Walt Disney World, Disneyland, Disney Cruise Line), Disney Media & Entertainment Distribution (e.g., Disney+, ESPN+, Hulu), Studios Content (e.g., The Walt Disney Studios, Disney Theatrical Group), General Entertainment Content (e.g., 20th Television, ABC Entertainment, ABC News) and ESPN and Sports Content. DDSI leverages technology, data analytics, optimization, statistical and econometric modeling to explore opportunities, shape business decisions and drive business value.
The team is involved in various activities ranging from data acquisition and validation, designing and implementing ETL/ELT data pipelines, designing and implementing databases, and evolving our next generation data platform to fulfill the needs of our applications, data services, ad-hoc analytics and self-service/POC initiatives.
The Data Engineering team (within DDSI) is looking to fill a Senior Data Engineer role.
Responsible for:
\t- designing and implementing\t
\t\t- ETL/ELT data pipelines\t\t
- database schema/tables/views\t
\t\t - building batch processes leveraging Airflow for several projects.\t
- partnering with our Studios or Enterprise Technology team members in various activities around data requirements gathering, data validation scripting and review, developing and monitoring ETL/ELT data pipelines, designing and implementing database schema/tables/views, plus deployment across multiple environments such as DEV, QA/UAT and PROD.\t
- leveraging a multitude of technologies to fulfill the work including, but not limited to SQL, Python, Docker, Gitlab, Airflow, Snowflake, Looker and PostgreSQL
Basic Qualifications:
- 3+ years of experience:
	- in a Data Engineering role
	- with Python
	- with designing, building, and maintaining ETL/ELT data pipelines
- 4+ years of experience with SQL
	- SQL query optimization based on runtime and cost
- SQL, Python, Snowflake, Gitlab, Airflow, AWS S3, and relational databases ++
- Snowflake, Airflow, Gitlab, Tableau ++
- 2+ years:
	- with cloud-based technologies, preferably AWS and S3
	- using Apache Airflow and Databricks
	- leveraging, designing, and building relational databases using Gitlab/Github
- working on a cloud platform ++
- working with data lakes, data warehouses, and application databases ++
- working with large datasets and big data technologies, preferably cloud-based, such as Snowflake, Databricks, or similar
- participating in driving best practices around data engineering software development processes
- experience in the streaming, media, or digital marketing domain +++
\t
Must be onsite in Burbank. Two-step interview process: the first interview is 30 minutes; round 2 is a one-hour panel interview.
"
]