This key position will be integral in using data to support our Education mission. This position is responsible for the design, specification, coding, testing, implementation, maintenance, documentation, debugging, and troubleshooting of data warehouse solutions. Its work includes analyzing source system data, designing dimensional data models, developing ETLs, and creating data visualization products such as reports and dashboards. The position plays a critical role in requirement gathering, project planning, technical architecture design, go-live implementation, and ongoing system support. In addition to these duties, the position is required to perform any other tasks assigned to support the department’s function.
A data engineer is responsible for designing, developing, testing, documenting, and maintaining data warehouse and analytics architecture to meet UCSF’s data and analytics needs.
- Collaborate with business and technology partners to gather business requirements and create architectural designs spanning UCSF missions: Education, Research, Health, and Financial and Administrative Services
- Use business domain knowledge to profile data and determine the best approaches to extract data into data warehouses.
- Design data models and ETLs in the Education Data Warehouse for all UCSF Schools and Programs.
- Apply the Kimball dimensional data modeling technique to build and enhance the enterprise data warehouse, following an enterprise-wide design intent that connects Education to other UCSF mission assets.
- Design, develop, test, document, and support new and existing ETL processes using Microsoft SQL Server Integration Services (SSIS), AWS Glue, Azure Data Factory, IBM InfoSphere DataStage, or other tools.
- Write ETL specifications for junior data engineers to execute.
- Demonstrate fluency in SQL programming and performance tuning.
- Develop business intelligence products, such as reports and dashboards, using Tableau or other data visualization tools.
- Document business rules and metadata in the data dictionary and keep the information updated.
- Possess excellent communication skills and the ability to articulate system designs and patterns to varying levels of leadership.
- Participate in project planning, including scoping backlogs and determining estimates.
- Assume on-call duties to support the operation of reporting and analytics systems.
- Maintain appropriate business domain knowledge in healthcare, education, research, finance, and business administration.
The final salary and offer components are subject to additional approvals based on UC policy.
To see the salary range for this position, we recommend noting the job code and using it to search: TCS Non-Academic Titles Search (https://tcs.ucop.edu/non-academic-titles)
Please note: An offer will take into consideration the experience of the final candidate AND the current salary level of individuals working at UCSF in a similar role.
For roles covered by a bargaining unit agreement, there will be specific rules about where a new hire would be placed on the range.
To learn more about the benefits of working at UCSF, including total compensation, please visit: https://ucnet.universityofcalifornia.edu/compensation-and-benefits/index.html
- Epic Clarity Administration or Data Model
- Experience with AWS Glue, Microsoft Azure Data Factory, IBM InfoSphere DataStage, or other ETL tools beyond Microsoft SSIS.
- Experience with cloud computing and technologies.
- Experience in Agile working environment is desired.
- Experience in at least one programming language, such as Python, R, Java, or C++.
- Experience with clinical, healthcare administration, education, research, financial, or business administrative data.
- Experience with the Epic electronic health record databases including Clarity and Caboodle.
- Advanced knowledge of secure software development
- Highly advanced skills in software specification, design, modification, implementation, and deployment of large-scale systems.
- Excellent project leadership and management skills.
- Able to diagnose and resolve performance issues. Experience working with other technical and business teams on system performance tuning.
- Bachelor’s degree in a related area OR work-related experience in information technology with an emphasis on database / data warehouse systems support.
- Minimum 5 years’ experience overall in the following technical areas:
- Microsoft SQL Server database administration.
- T-SQL. Experienced in writing complex SQL queries and stored procedures. Adept at performance tuning.
- Microsoft SQL Server Integration Services (SSIS). Demonstrated ability to create, code, and test ETL routines with a heavy emphasis on data sourcing, ingestion, and transformation across multiple steps and multiple ETL jobs. Possess an understanding of the principles of effective ETL design.
- Administering and orchestrating jobs on the Microsoft SSIS server.
- Experience developing and executing complex test plans.
- Demonstrated familiarity with AWS tools and terms (S3, EC2, EBS, DMS).
- Possess working knowledge of networking and firewalls.
- Demonstrate effective communication and interpersonal skills, including the ability to communicate technical information to technical and non-technical personnel at various levels of the organization.
- Demonstrate complex problem-solving skills.
- Detail-oriented writing and maintenance of technical and process documentation.
- Must be able to work with multiple RDBMS environments such as Oracle, SQL Server, MySQL, and so on.
- Self-motivated; able to work independently and as part of a team, learn effectively, and meet deadlines.
- Demonstrated ability to automate functions in AWS (e.g. Lambda functions, PowerShell scripts).