
iOS Developer (Nigeria)

Talent Hackers
Nigeria only


About The Role:

Our client is an evidence-first AI platform for disaster restoration and insurance claims. They capture field evidence (photos, video, moisture logs, LiDAR scans, voice notes) via a mobile-first guided inspection app, build a graph-backed digital twin of the property loss, and use agentic AI to generate carrier-ready claim packets with full audit trails. The iOS app is the primary entry point for all field data and the most critical user-facing surface of the entire platform.

This is NOT a generic consumer app. This is a professional field-operations tool used by contractors on active job sites: water-damaged basements, fire-loss walkthroughs, mold remediation inspections. The mobile experience must work flawlessly in hostile conditions (poor lighting, no connectivity, one-handed operation, wet/dusty environments) and produce structured, auditable evidence that feeds directly into an AI pipeline.

Core Responsibilities:

iOS Development & UI/UX (Primary)

  • Design, build, and maintain the iOS application in Swift (SwiftUI primary, UIKit where needed) as the primary field-capture interface for restoration contractors
  • Own the end-to-end mobile UI/UX: guided inspection workflows, structured photo/video capture with metadata tagging (GPS, timestamp, room/area labels), moisture reading input, voice-to-text notes, and document scanning
  • Implement offline-first architecture using Core Data or SwiftData for local persistence, with reliable background sync and conflict resolution when connectivity is restored (a minimal persistence-and-sync sketch follows this list)
  • Design for field conditions: one-handed operation, large touch targets, high-contrast interfaces for poor lighting, gloved-hand input tolerance, and rapid capture workflows that minimize taps per evidence item
  • Build real-time camera integration with custom capture flows (guided photo angles, overlay templates for consistent documentation, multi-shot sequences per room/area)
  • Implement push notifications, background processing, and local scheduling for inspection reminders, sync status, and evidence completeness alerts
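
To make the offline-first expectation above concrete, here is a minimal sketch using SwiftData; the EvidenceItem model, its fields, and the uploadEvidence callback are illustrative placeholders rather than the actual schema or backend contract.

import Foundation
import SwiftData

// Hypothetical evidence record, persisted locally so capture never blocks on connectivity.
@Model
final class EvidenceItem {
    var id: UUID
    var roomLabel: String
    var capturedAt: Date
    var latitude: Double?
    var longitude: Double?
    var localFilePath: String   // path to the photo/video file on disk
    var isSynced: Bool

    init(roomLabel: String, localFilePath: String,
         latitude: Double? = nil, longitude: Double? = nil) {
        self.id = UUID()
        self.roomLabel = roomLabel
        self.capturedAt = .now
        self.latitude = latitude
        self.longitude = longitude
        self.localFilePath = localFilePath
        self.isSynced = false
    }
}

// One sync pass: fetch unsynced items, upload each, and mark it synced on success.
// `uploadEvidence` stands in for whatever call the real backend exposes.
@MainActor
func syncPendingEvidence(context: ModelContext,
                         uploadEvidence: (EvidenceItem) async throws -> Void) async {
    let pending = FetchDescriptor<EvidenceItem>(
        predicate: #Predicate<EvidenceItem> { $0.isSynced == false },
        sortBy: [SortDescriptor(\.capturedAt)]
    )
    guard let items = try? context.fetch(pending) else { return }
    for item in items {
        do {
            try await uploadEvidence(item)   // e.g. POST metadata + file reference
            item.isSynced = true             // real conflict resolution would go here
        } catch {
            // Leave isSynced == false so the item is retried on the next pass.
        }
    }
    try? context.save()
}

In the real app a pass like this would be triggered from a background task or an NWPathMonitor connectivity callback, with genuine conflict resolution rather than the last-write-wins shortcut shown here.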

Cloud & AI Integration (Critical)

  • Integrate the iOS app with our backend services via RESTful and GraphQL APIs (Node.js/TypeScript backend)
  • Connect mobile evidence capture to the NVIDIA NIM inference pipeline: upload photos/video that trigger AI classification, damage detection, and scope generation on the backend
  • Integrate NVIDIA VSS (Video Search and Summarization) outputs into the mobile review experience, allowing field techs to verify AI-generated scope items against video evidence
  • Consume and display outputs from the agentic AI pipeline: scope suggestions, confidence scores, evidence gap alerts, and citation-linked packet previews
  • Integrate with cloud storage (Nebius/S3/MinIO) for evidence upload with progress tracking, resumable uploads, and metadata preservation (see the upload sketch after this list)
  • Implement secure authentication (OAuth2/JWT), role-based access control per project and tenant, and PII-safe data handling throughout the mobile layer
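
As a rough illustration of the upload path, the sketch below uses a background URLSession with delegate-based progress reporting, assuming the backend issues pre-signed PUT URLs; the session identifier, retry behaviour, and error handling are simplified placeholders.

import Foundation

final class EvidenceUploader: NSObject, URLSessionTaskDelegate {

    // A background configuration lets large photo/video uploads continue
    // even if the app is suspended; the identifier here is arbitrary.
    private lazy var session: URLSession = {
        let config = URLSessionConfiguration.background(withIdentifier: "evidence-upload")
        config.isDiscretionary = false
        config.sessionSendsLaunchEvents = true
        return URLSession(configuration: config, delegate: self, delegateQueue: nil)
    }()

    /// Reports 0.0–1.0 progress so the UI can show per-item upload state.
    var onProgress: ((Double) -> Void)?

    func upload(fileURL: URL, to presignedURL: URL) {
        var request = URLRequest(url: presignedURL)
        request.httpMethod = "PUT"
        session.uploadTask(with: request, fromFile: fileURL).resume()
    }

    // Delegate callback: translate bytes sent into fractional progress.
    func urlSession(_ session: URLSession, task: URLSessionTask,
                    didSendBodyData bytesSent: Int64,
                    totalBytesSent: Int64,
                    totalBytesExpectedToSend: Int64) {
        guard totalBytesExpectedToSend > 0 else { return }
        onProgress?(Double(totalBytesSent) / Double(totalBytesExpectedToSend))
    }

    func urlSession(_ session: URLSession, task: URLSessionTask,
                    didCompleteWithError error: Error?) {
        // On failure, the real app would re-queue the evidence item for the next sync pass.
    }
}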

Collaboration & Product

  • Work directly with the CTO and founding engineering team to translate product requirements into polished mobile experiences
  • Collaborate with design (if/when hired) to maintain and evolve the design system; in the interim, own UI/UX decisions informed by Apple Human Interface Guidelines and field-user research
  • Participate in weekly design-partner feedback sessions to understand how restoration contractors actually use the app on job sites and iterate accordingly
  • Write clean, tested, well-documented Swift code; participate in code reviews and architectural discussions
  • Manage App Store submission, TestFlight distribution for pilot partners, and release management

Must-Have Requirements:

iOS / Swift

  • 3+ years of production iOS development in Swift with published App Store applications
  • Experience with LiDAR scanning (ARKit, Polycam SDK), 3D measurement capture, or AR overlay features on iOS
  • Strong proficiency in SwiftUI (primary) and UIKit; ability to build custom UI components and complex navigation flows
  • Hands-on experience with iOS camera APIs (AVFoundation), GPS/Core Location, local storage (Core Data or SwiftData), and background processing (a capture sketch follows this list)
  • Proven offline-first mobile architecture: local persistence, background sync, conflict resolution, and reliable data upload on reconnection
  • Deep understanding of iOS UI/UX principles, Apple Human Interface Guidelines, and accessibility standards
  • Experience with App Store submission, TestFlight distribution, certificate management, and release processes
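
For reference, a stripped-down AVFoundation capture sketch that pairs each photo with GPS and timestamp metadata; permission prompts, preview layers, and full error handling are omitted, and the hand-off to local persistence is only hinted at.

import AVFoundation
import CoreLocation

enum CaptureError: Error { case noCamera }

final class CaptureController: NSObject, AVCapturePhotoCaptureDelegate {
    private let session = AVCaptureSession()
    private let photoOutput = AVCapturePhotoOutput()
    private let locationManager = CLLocationManager()

    func configure() throws {
        session.beginConfiguration()
        guard let camera = AVCaptureDevice.default(.builtInWideAngleCamera,
                                                   for: .video, position: .back) else {
            throw CaptureError.noCamera
        }
        let input = try AVCaptureDeviceInput(device: camera)
        if session.canAddInput(input) { session.addInput(input) }
        if session.canAddOutput(photoOutput) { session.addOutput(photoOutput) }
        session.commitConfiguration()
        locationManager.requestWhenInUseAuthorization()
        session.startRunning()   // production code would call this off the main thread
    }

    func capturePhoto() {
        photoOutput.capturePhoto(with: AVCapturePhotoSettings(), delegate: self)
    }

    // Delegate callback: pair the captured JPEG with location and timestamp
    // before handing it to the local persistence layer.
    func photoOutput(_ output: AVCapturePhotoOutput,
                     didFinishProcessingPhoto photo: AVCapturePhoto,
                     error: Error?) {
        guard error == nil, let data = photo.fileDataRepresentation() else { return }
        let coordinate = locationManager.location?.coordinate
        _ = (data, coordinate, Date())   // e.g. write to disk and create an EvidenceItem
    }
}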

API & Cloud Integration

  • Experience integrating iOS apps with RESTful APIs and managing asynchronous network calls (URLSession, Combine, or async/await)
  • Familiarity with cloud storage integration (AWS S3 or equivalent) for large file uploads (photos, video) with progress tracking and resumable transfers
  • Experience implementing secure authentication flows (OAuth2, JWT) and role-based access control on mobile
  • Comfortable consuming and displaying AI/ML model outputs in a mobile interface (confidence scores, classification results, suggested actions); see the decoding sketch after this list
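
As an example of what this looks like in practice, the sketch below fetches and decodes AI scope suggestions with async/await and a bearer token; the endpoint path, payload shape, and type names are assumptions for illustration, not the real API contract.

import Foundation

// Hypothetical shape of a scope-suggestion payload from the backend AI pipeline.
struct ScopeSuggestion: Decodable {
    let id: String
    let summary: String
    let confidence: Double        // 0.0 ... 1.0 from the backend model
    let evidenceGaps: [String]    // e.g. "no moisture reading for Bedroom 2"
    let citations: [URL]          // links back to the evidence items the AI cited
}

struct ScopeResponse: Decodable {
    let claimId: String
    let suggestions: [ScopeSuggestion]
}

// Endpoint and authentication scheme are placeholders for illustration only.
func fetchScopeSuggestions(claimId: String,
                           baseURL: URL,
                           accessToken: String) async throws -> [ScopeSuggestion] {
    var request = URLRequest(url: baseURL.appendingPathComponent("claims/\(claimId)/scope"))
    request.setValue("Bearer \(accessToken)", forHTTPHeaderField: "Authorization")

    let (data, response) = try await URLSession.shared.data(for: request)
    guard let http = response as? HTTPURLResponse, http.statusCode == 200 else {
        throw URLError(.badServerResponse)
    }
    return try JSONDecoder().decode(ScopeResponse.self, from: data).suggestions
}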

Working Style

  • Experience working in a fast-paced startup environment, ideally as a founding or early-stage engineer
  • Strong written and verbal English communication skills for daily async collaboration (WhatsApp, GitHub, Loom)
  • Ability to work EST-aligned hours for overlap with the US-based leadership team
  • Proficiency with Git, pull request workflows, and CI/CD pipelines (GitHub Actions, Fastlane)
  • Agile/Scrum methodology experience; comfort with weekly sprint cycles and rapid iteration
  • High autonomy: you set priorities, unblock yourself, propose solutions, and communicate proactively when stuck

Nice-to-Have:

NVIDIA & AI Stack

  • Experience integrating NVIDIA NIM (NVIDIA Inference Microservices) or NeMo Retriever outputs into mobile applications
  • Familiarity with consuming LLM/RAG pipeline outputs in a mobile context (structured JSON responses, streaming results, confidence thresholds)
  • Experience with on-device ML (Core ML, Vision framework) for real-time photo classification, damage detection, or OCR (a Vision sketch follows this list)
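
As a sketch of what on-device inference might look like here, the snippet below runs a Vision + Core ML classification; DamageClassifier is a hypothetical Xcode-compiled model and the label vocabulary is invented for illustration.

import CoreML
import Vision
import UIKit

func classifyDamage(in image: UIImage,
                    completion: @escaping (_ label: String, _ confidence: Float) -> Void) {
    guard let cgImage = image.cgImage,
          let wrapped = try? DamageClassifier(configuration: MLModelConfiguration()),
          let model = try? VNCoreMLModel(for: wrapped.model) else { return }

    let request = VNCoreMLRequest(model: model) { request, _ in
        guard let top = (request.results as? [VNClassificationObservation])?.first else { return }
        completion(top.identifier, top.confidence)   // e.g. ("water_damage", 0.91)
    }

    let handler = VNImageRequestHandler(cgImage: cgImage, orientation: .up)
    try? handler.perform([request])   // production code would run this off the main thread
}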

Graph & Data

  • Exposure to Neo4j or property graph concepts (understanding how entities, relationships, and provenance flow through the app)
  • Experience with PostgreSQL/pgvector or vector search concepts as they surface in mobile search/recommendation features

Domain & Platform

  • Experience building technology for insurance, construction, restoration, or regulated field-operations industries
  • Android development skills (Kotlin/Jetpack Compose) for future cross-platform expansion
  • Experience with Flutter or React Native as a complement to native iOS (for rapid prototyping or shared logic layers)

About the job

Job type: Full Time
Experience level: Mid-level
Location requirements: Nigeria only
Hiring timezones: Nigeria +/- 0 hours