Talent.com
This position is no longer accepting applications.
Software Engineer, Asset Management Data

Charles Schwab · Austin, TX, US
Job type: Full-time

Job Description

Your Opportunity

At Schwab, you’re empowered to make an impact on your career. Here, innovative thought meets creative problem solving, helping us “challenge the status quo” and transform the finance industry together. We believe in the importance of in-office collaboration and fully intend for the selected candidate for this role to work on site in the specified location(s).

Schwab Wealth and Asset Management Engineering is a part of the Schwab Technology Services organization supporting Schwab's money management, research, and asset management platforms to help clients manage their wealth.

We are seeking a Data Engineer to build out the cloud-native data platform for Schwab Asset Management (SAM). This role is ideal for a professional with progressive experience in cloud-native data engineering who is ready to take on more responsibility and operate with minimal supervision. You will be integral to enabling and enhancing our data assets and data pipelines, and to supporting a data platform on SQL Server, Snowflake, and Google Cloud. This is an exciting opportunity to work in a dynamic, data-driven environment, contributing to the ongoing development and optimization of our data platform.

The role requires hands-on development in a client-driven technology organization, executing regulatory, tactical, and strategic business initiatives focused on developing the data platform and delivering analytics and reporting projects. The ideal candidate is detail-oriented and works in an Agile and DevOps model in partnership with the business, actively working with Product Owners, end users, and partners to manage requirements, design, coding, testing (unit and functional), deployment, and post-release support, as well as engaging in migration activities to evolve the on-premises data stack to the cloud.

The ideal candidate will have:

  • 5+ years of working experience and sound knowledge in building data platforms leveraging cloud-native architecture (GCP/AWS), ETL/ELT, and data integration
  • 3-5 years of development experience with cloud services (AWS, GCP, Azure) utilizing various supporting tools (e.g., GCS, Cloud Dataflow, Airflow (Composer), Cloud Pub/Sub)
  • 3-5 years of experience and sound knowledge in developing reliable data pipelines leveraging data warehouses (Snowflake, BigQuery, SQL Server) and data processing frameworks (Apache Spark, Apache Beam, Apache Flink, Informatica, SSIS, Pentaho)
  • Knowledge of NoSQL database technologies (e.g., MongoDB, Bigtable, DynamoDB)
  • Expertise in build and deployment tools (Visual Studio, PyCharm, Git/Bitbucket/Bamboo, Maven, Jenkins, Nexus, TeamCity)
  • Experience in database design techniques and philosophies (e.g., RDBMS, Document, Star Schema, Kimball model)
  • Experience leveraging continuous integration/deployment tools (e.g., Bamboo, Docker, containers, GitHub, GitHub Actions) in a CI/CD pipeline
  • Experience with SQL, ETL, and other code-based data transformation and delivery technologies.
  • Experience in messaging and services-based software, preferably on a cloud platform using RabbitMQ, Kafka, or equivalent
  • Experience with Informatica Developer Tool set or Data Validation Option (DVO) is a plus.
  • Advanced understanding of software development and research tools
  • Attention to detail and results oriented, with a strong customer focus
  • Ability to work as part of a team and independently
  • Analytical and problem-solving skills
  • Strong technical communication skills
  • Ability to prioritize workload to meet tight deadlines

What you have

To ensure that we fulfill our promise of "challenging the status quo," this role has specific qualifications that successful candidates should have.

Required Qualifications:

  • Bachelor’s degree in Computer Science, Engineering, Mathematics, or a related field
  • 5+ years of experience in data engineering or similar roles
  • Proficiency in SQL and experience with relational databases (e.g., SQL Server, PostgreSQL, Snowflake)
  • Solid experience with Python for data processing, ETL development, tooling, and analytics implementations
  • Familiarity with cloud data platforms (e.g., Snowflake, Google BigQuery)
  • Experience building and maintaining ETL pipelines
  • Strong experience in data modeling, including designing normalized and denormalized schemas for financial data
  • Understanding of financial data concepts
  • Knowledge of data quality, validation, and governance best practices
Preferred Qualifications:

  • Deep understanding of data architectures and engineering patterns of data pipelines and reporting environments.
  • Familiarity with data visualization tools (Power BI, Tableau)
  • Exposure to regulatory requirements in finance (e.g., GDPR, N-PORT, 13F/G)
  • Analytical and troubleshooting skills to identify and resolve data and platform issues effectively.
  • Ability to work collaboratively within an agile team environment, supporting cross-functional initiatives and contributing to shared goals.
  • Strong documentation skills and the ability to communicate technical concepts clearly and effectively to both technical and non-technical stakeholders.

In addition to the salary range, this role is also eligible for bonus or incentive opportunities.

Why work for us?

Own Your Tomorrow embodies everything we do! We are committed to helping our employees ignite their potential and achieve their dreams. Our employees get to play a central role in reinventing a multi-trillion-dollar industry, creating a better, more modern way to build and manage wealth.

Benefits: We offer a competitive and flexible package designed to help you make the most of your life at work and at home, today and in the future.
