Role
Company
What you'll do:
- Design and build features and distributed platforms at scale.
- Drive impactful initiatives across a globally distributed infrastructure.
- Collaborate with product managers, architects, other engineering teams, and business groups to drive end-to-end solutions.
- Build resource management infrastructure powering big data and machine learning workloads on the Databricks platform in a scalable, secure, and cloud-agnostic way.
- Develop reliable, scalable services and client libraries that work with massive amounts of data in the cloud, across geographic regions and cloud providers.
What makes you a great fit:
- 3+ years of industry experience designing, building, and supporting large-scale systems in production.
- Fluency in Python, Node.js, and Go.
- Good knowledge of SQL.
- Knowledge of algorithms and data structures. Familiarity with database internals, data governance, and/or payment systems is a huge plus.
- Experience with cloud technologies, e.g., AWS, Azure, GCP, Docker, or Kubernetes.
- Experience with security and systems that handle sensitive data is a bonus.
- Experience in customer-facing product development and collaboration with cross-functional teams.
- Experience working on a SaaS platform or with Service-Oriented Architectures.
60% lower TCO and 5-10x superior performance for the most demanding analytics workloads on your data lake or lakehouse.

Existing data warehouse and data lakehouse engines fall short when it comes to the most demanding enterprise analytics workloads. Data engineering and platform teams find it impossible to simultaneously achieve lower TCO, ever-increasing usage, and query latency/throughput SLAs. e6data's answer is to drive breakthroughs on the processing-efficiency frontier. This pursuit drove us to a clean-slate, bold new approach to building MPP-style columnar processing and vectorized execution engines.

Keep what's working, and get started with e6data only on your most expensive and/or challenging workloads on your existing data lakehouse: 60% lower TCO, 5-10x faster queries, and higher concurrency.
1) Works with your existing data lakehouse: Delta / Iceberg / Hudi / Hive
2) Secure and private: runs fully within your AWS / GCP / Azure account (incl. existing VPCs)
3) No changes to BI/reporting tools, application layer, or queries
4) No changes to your ETL or data pipelines
5) Deploys in your existing Kubernetes cluster
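For illustration only (this is not e6data's implementation, and all names below are hypothetical): the core idea behind vectorized, columnar execution is that each operator processes an entire column batch per call instead of one row at a time, amortizing interpretation overhead over thousands of values. A minimal NumPy sketch of the difference:

```python
import numpy as np

# Hypothetical columnar batch: one contiguous array per column.
prices = np.array([9.5, 120.0, 3.2, 75.0, 200.0])
quantities = np.array([10, 2, 50, 4, 1])

# Row-at-a-time style: one interpreted loop iteration per row.
row_at_a_time = sum(p * q for p, q in zip(prices, quantities) if p > 50.0)

# Vectorized style: the filter and multiply-sum run as tight native
# loops over whole columns, the pattern columnar engines exploit.
mask = prices > 50.0
vectorized = float(np.sum(prices[mask] * quantities[mask]))

assert row_at_a_time == vectorized
```

Real engines layer query planning, compression, and distributed (MPP) scheduling on top, but the per-batch operator shape is the same.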