We’re seeking a Data Engineer with hands-on Protegrity data tokenization expertise who can lead Data Loss Prevention (DLP) initiatives and deliver secure tokenization patterns across large SQL estates.
Requirements
- Lead the DLP program end-to-end: data discovery & classification, policy design, control implementation, governance reporting, and incident response playbooks.
- Design & deliver Protegrity tokenization for high-volume SQL environments (e.g., SQL Server, Oracle, PostgreSQL): schema impact analysis, format-preserving tokenization, detokenization services, performance baselines, and migration runbooks (see the tokenization sketch after this list).
- Establish security-by-design data patterns: key management (AWS KMS), secrets management, encryption in transit/at rest, and auditable lineage from source to consumer.
- Build and optimize data pipelines on AWS (Glue, Lambda, Step Functions, with IAM for least-privilege access) to operationalize tokenization at scale and automate governance checks.
- Implement and tune OpenSearch indices and ingestion flows; instrument for relevance, resilience, and latency to deliver fast search experiences for downstream applications (see the indexing sketch below).
- Govern APIs with Kong Gateway: configure authentication/authorization policies, rate limiting, and observability; apply Kong Mesh for secure service-to-service communication (zero-trust, mTLS, traffic policies) across microservices (see the gateway sketch below).
- Partner with stakeholders (Security, Data Architecture, App Engineering, Compliance) to align requirements, define SLAs, and drive adoption of secure data access patterns.
- Mentor engineers and champion secure coding/data handling standards; contribute reusable Terraform/Helm modules and internal best-practice playbooks.
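
To give candidates a concrete feel for the work, here is a minimal sketch of an AWS Lambda step that tokenizes a sensitive field in flight. The PROTEGRITY_API_URL endpoint, the request/response shape, and the US_SSN data element are hypothetical placeholders, not Protegrity's actual API; a real deployment would call the Protegrity appliance or SDK per its documentation.

```python
# Minimal sketch: Lambda handler that tokenizes a sensitive field in flight.
# PROTEGRITY_API_URL and the request/response shape are hypothetical placeholders;
# substitute the actual Protegrity REST/SDK interface used in your deployment.
import json
import os
import urllib.request

PROTEGRITY_API_URL = os.environ.get(
    "PROTEGRITY_API_URL", "https://tokenizer.internal/tokenize"
)

def tokenize(value: str, data_element: str) -> str:
    """Call the (hypothetical) tokenization service for one value."""
    payload = json.dumps({"data": value, "element": data_element}).encode("utf-8")
    req = urllib.request.Request(
        PROTEGRITY_API_URL,
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req, timeout=5) as resp:
        return json.loads(resp.read())["token"]

def handler(event, context):
    """Tokenize the 'ssn' field of each incoming record before forwarding."""
    out = []
    for record in event.get("records", []):
        record["ssn"] = tokenize(record["ssn"], data_element="US_SSN")
        out.append(record)
    return {"records": out}
```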
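Next, a minimal sketch of the OpenSearch index work, using the opensearch-py client. The index name, field mappings, and shard/replica counts are illustrative assumptions, not prescriptions; real values depend on data volume and SLAs.

```python
# Minimal sketch: create a search index with explicit mappings and tuned settings.
# Index name, fields, and shard/replica counts are illustrative assumptions.
from opensearchpy import OpenSearch  # pip install opensearch-py

client = OpenSearch(hosts=[{"host": "localhost", "port": 9200}])

index_body = {
    "settings": {
        "number_of_shards": 3,      # size to data volume; illustrative
        "number_of_replicas": 1,    # resilience against node loss
        "refresh_interval": "30s",  # trade indexing throughput vs. freshness
    },
    "mappings": {
        "properties": {
            "customer_token": {"type": "keyword"},  # tokenized id, exact match only
            "event_time": {"type": "date"},
            "description": {"type": "text"},        # full-text relevance field
        }
    },
}

client.indices.create(index="events-v1", body=index_body)
```

Note that only the tokenized identifier is indexed, never the raw value, which keeps the search tier inside the tokenization boundary.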
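Finally, a minimal sketch of Kong Gateway governance via the Admin API: registering a service and route, then attaching key-auth and rate-limiting plugins. The Admin URL, service names, and limits are assumptions; production setups typically manage this declaratively (e.g., via the Terraform modules mentioned above) rather than with ad-hoc API calls.

```python
# Minimal sketch: register a service/route in Kong and attach key-auth and
# rate-limiting plugins via the Admin API. Admin URL, names, and limits are
# illustrative assumptions.
import requests  # pip install requests

KONG_ADMIN = "http://localhost:8001"  # assumed Admin API address

# Register the upstream service and a route that exposes it.
requests.post(
    f"{KONG_ADMIN}/services",
    json={"name": "token-api", "url": "http://token-api.internal:8080"},
).raise_for_status()
requests.post(
    f"{KONG_ADMIN}/services/token-api/routes",
    json={"name": "token-api-route", "paths": ["/tokens"]},
).raise_for_status()

# Require API keys and cap request rates on the service.
requests.post(
    f"{KONG_ADMIN}/services/token-api/plugins",
    json={"name": "key-auth"},
).raise_for_status()
requests.post(
    f"{KONG_ADMIN}/services/token-api/plugins",
    json={"name": "rate-limiting", "config": {"minute": 60, "policy": "local"}},
).raise_for_status()
```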
Benefits
- 401(k) Matching
- Retirement Plan
- Tuition Reimbursement
- Relocation Assistance
