Andrew Pollack
Senior Software Engineer with expertise in infrastructure, developer tooling, and cloud development, focused on building scalable systems that boost developer productivity while optimizing cloud costs. Passionate about improving the software development lifecycle by enabling confident, easily reversible changes through automation, robust CI/CD, and internal platform tooling. Experienced in leading technical initiatives, mentoring engineers, and fostering a culture of continuous learning and growth.
Skills
Languages: Python | Go | Rust
Technologies: AWS | Linux | Docker | Kubernetes | Ansible | Terraform | Packer | Bazel | Jenkins | Buildkite | GitHub
Experience
Skydio: Senior Software Engineer L5
Nov 2023 - Present, San Mateo, CA
Cross-functional infrastructure team, designing and building scalable internal tools that empower engineers to write, test, and release software with confidence.
- Owned and operated core developer infrastructure on AWS and Kubernetes:
- Managed infrastructure provisioning, configuration, and operations using Packer, Ansible, Terraform, and ArgoCD.
- Operated CI platforms (Buildkite, GitHub Actions, Prow merge queue, Jenkins).
- Maintained artifact repositories (Nexus Repository, Harbor) and developer cloud workstations (Coder).
- Reduced toil through process automation, improved monitoring, a weekly on-call rotation, and runbooks.
- Migrated the CI control plane from Jenkins to Buildkite. Outcomes included:
- Enhanced security through an isolated AWS environment and an allow-list-enforced egress proxy service.
- Integration tooling that seamlessly connects existing CI tooling with Buildkite.
- New metrics tracking the worker infrastructure lifecycle, identifying bottlenecks for data-driven optimization.
- Implemented a merge-locking mechanism in Go for the CI merge queue platform (Prow), reducing batch cancellation rates from 20% to under 2% and saving an average of 140 minutes of CI time per batch merge.
- Led a Cloud Spend working group, shifting financial reviews to a weekly cadence and leveraging AWS metrics and improved tagging to reduce CI and development costs by 25%.
- Won hackathon (team of two) by implementing Ruff tooling for Python linting, reducing linting checks per pull request from 60 minutes to under 15 minutes.
Google LLC: Software Engineer L4
Dec 2021 - Sep 2023, Mountain View, CA
Software Delivery team on Fuchsia, building Fuchsia's software delivery story, including over-the-air (OTA) updates, with security, reliability, and recovery as priorities.
- Redesigned and implemented "RFC-0208: Distributing Packages with the SDK," adding new build system templates and target types for the Fuchsia SDK's GN (Generate Ninja) ruleset. Improved versioning, testing, and distribution for packages to downstream customers.
- Developed host-to-target communication protocols for registering blob payload sources, utilizing a Rust-based command-line interface (CLI) for tooling.
- Facilitated the monthly Fuchsia Rust Working Group, organizing discussions on use cases, ecosystem development, source talks, and brainstorming sessions to improve the Rust developer experience.
The Rust Project: Editor/Publisher, This Week in Rust
Nov 2021 - Present, Virtual
Open source weekly newsletter with over 30,000 readers, capturing developments, learning resources, and community events in the Rust community.
- Developed a Docker-based publishing workflow, ensuring consistent static website and email generation.
- Participate in the weekly publishing rotation; implemented a Discord bot to coordinate handoffs.
Fiddler Labs: Software Engineer II, Data Platform
Nov 2020 - Nov 2021, Palo Alto, CA
Data Platform team at Fiddler Labs, a leader in Artificial Intelligence (AI) monitoring and explainability.
- Restructured the application logic of the monitoring aggregation system, enabling horizontal scalability through application-level sharding using a pub/sub (RabbitMQ) pattern deployed on Kubernetes.
- Designed and implemented a new API framework for ingesting logs across different data formats (CSV, Parquet, etc.) and cloud storage options (S3, GCS), reducing novel data integration development time from days to hours.