Overview:
Requirements
- Bachelor’s degree in Computer Science, Information Technology, Engineering, or a related field (or equivalent experience).
- Minimum of 2 years' experience in enterprise data operations, especially supporting RDF, ETL (NiFi), SQL (DB2, Oracle, etc.), and/or Databricks.
- Familiarity with cloud platforms (e.g., AWS, Azure, Kubernetes, Docker, OpenShift) and development frameworks (e.g., microservices, APIs).
- Programming proficiency in shell scripting (Bash/KSH) and Python.
- Familiarity with machine learning or generative AI is highly desirable.
- Experience with graph data modeling, particularly in RDF for large enterprise environments.
- Basic understanding of software development, systems architecture, and design patterns.
- Strong analytical and problem-solving skills with a willingness to learn and grow.
- Excellent communication skills and the ability to collaborate effectively with team members and stakeholders.
- A keen interest in technology and a passion for solution design and architecture.
- Demonstrated experience working with a cross-functional and geographically dispersed team.
- Relevant certifications (e.g., AWS, Azure, Kubernetes) are a plus.
- Applicants must be authorized to work for ANY employer in the U.S. We are unable to sponsor or take over sponsorship of an employment visa at this time.
Responsibilities
- Assist in Solution Design:
- Work with senior architects to design solutions based on business requirements and technical needs.
- Support data modeling, ontology design, and schema development based on customer use cases.
- Help in creating architecture diagrams, technical specifications, and documentation.
- Assist in evaluating and recommending technologies and tools that align with project requirements.
- Collaborate with Teams:
- Collaborate with cross-functional teams, including software engineers, sales engineers, analysts, support engineers, project managers, and customer success managers, to ensure alignment on project goals and deliverables.
- Participate in design and technical discussions to gain insights into solution architecture best practices.
- Technical Implementation:
- Provide support to senior architects in identifying technical challenges and issues.
- Help troubleshoot problems and find effective solutions during the design and implementation phases.
- Support cloud-based implementations on AWS, Azure, and GCP.
- Assist in ETL processes, data source integration, and pipeline development to ingest customer data into Stardog's knowledge graph platform.
- Write and execute SPARQL queries, graph transformations, and data validation scripts to ensure accuracy and consistency.
- Work closely with presales engineers to develop proof-of-concept (POC) and proof-of-value (POV) solutions.
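To give a flavor of the data validation scripts mentioned above, here is a minimal Python sketch of a pre-load validation step in an ETL pipeline: records missing required fields are rejected before ingestion. The field names (`id`, `label`) are illustrative assumptions, not Stardog-specific requirements.

```python
# Minimal sketch of a pre-load validation step in an ETL pipeline.
# REQUIRED_FIELDS is an illustrative assumption, not a Stardog-specific schema.
REQUIRED_FIELDS = ("id", "label")

def validate_rows(rows):
    """Split rows into (valid, rejected) based on required fields."""
    valid, rejected = [], []
    for row in rows:
        missing = [f for f in REQUIRED_FIELDS if not row.get(f)]
        (rejected if missing else valid).append(row)
    return valid, rejected

rows = [
    {"id": "ex:alice", "label": "Alice"},
    {"id": "ex:bob"},  # missing "label" -> rejected
]
ok, bad = validate_rows(rows)
```

In a real pipeline the rejected rows would typically be logged or routed to a quarantine queue for review rather than silently dropped.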
- Learning and Development:
- Continuously learn about new technologies, trends, and best practices in solution architecture and Knowledge Graphs.
- Develop a solid understanding of data modeling and software development methodologies.
- Documentation:
- Assist in maintaining technical documentation, including design documents, architecture diagrams, and implementation guides.
- Ensure that all documentation is clear, accurate, and up to date.