Where Spatial Computing Meets Spatial Intelligence

Thank you for your interest in Meta{Parametric}.
We are continuously exploring new collaborations and regularly seek talented individuals to join our efforts at the intersection of spatial computing and AI.
If you are passionate about these domains and are interested in contributing to innovative research, development, or design, we welcome you to send your CV and portfolio to: jobs@metaparametric.com.
We look forward to connecting with professionals, researchers, and creators who share our vision for the future of immersive and intelligent environments.

Current Openings at Meta{Parametric}:

[Paid Roles (Contract-Based)]:

1. AR Application Developer (for Ray-Ban Meta Wayfarer AI Glasses)

Number of Positions: 1

Location: Hybrid/Remote

Start Date: July 2025

Duration: Long-term

Project Focus: AI-powered Augmented Reality for Architecture & Urban Design

Are you passionate about augmented reality and spatial computing? We’re seeking a skilled and creative AR Application Developer to join an exciting research-driven project focused on developing immersive, AI-enhanced AR experiences for Meta Ray-Ban smart glasses.

You will contribute to building an innovative platform for voice-controlled AR interfaces, real-time digital twin visualization, and context-aware overlays, collaborating across disciplines in a rapidly evolving technological space.

Key Responsibilities:

  • Design, develop, and deploy AR experiences optimized for Meta Ray-Ban smart glasses, using Meta Spark Studio, Meta View App SDK, and optional Unity-based prototypes.
  • Integrate AI-powered interactions, including voice control (via Whisper or Speech SDKs), real-time transcriptions, and generative visual overlays, through external cloud services (OpenAI, Firebase, or Azure).
  • Design intuitive, voice-activated, hands-free 3D UI/UX tailored for smart glasses, focusing on spatial interaction, minimal interface, and seamless scene integration.
  • Collaborate with a multidisciplinary team of XR designers, AI researchers, and cloud developers to deliver immersive architectural and urban-scale experiences.
  • Optimize assets and 3D models using tools such as Blender or Cinema4D, ensuring performance and visual fidelity on wearable devices.
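To give a flavor of the voice-driven interaction logic this role involves, here is a minimal, illustrative sketch of routing a transcribed voice command to an AR overlay action. All names here are hypothetical and not tied to any Meta SDK; in practice the transcript would come from Whisper or a Speech SDK and the action would drive the AR scene.

```python
# Illustrative only: map a voice transcript to an AR overlay action.
# Function and action names are hypothetical, not part of any real SDK.

def route_command(transcript: str) -> dict:
    """Map a transcribed voice command to a simple AR action payload."""
    text = transcript.lower()
    if "show" in text and "model" in text:
        return {"action": "load_overlay", "target": "digital_twin"}
    if "hide" in text:
        return {"action": "clear_overlay"}
    if "measure" in text:
        return {"action": "start_measurement"}
    # Unknown command: fall through so the UI can ask for clarification.
    return {"action": "noop", "hint": transcript}

print(route_command("Show the building model"))
```

A production system would replace the keyword checks with an intent classifier or an LLM function-calling step, but the contract — transcript in, structured action out — stays the same.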

Required Skills:

  • AR SDKs: Meta Spark Studio, Meta Spark Player + (Unity with AR Foundation, Vuforia, or 8thWall)
  • Device Integration: Meta View App SDK, Meta XR Voice SDK
  • Programming: TypeScript/JavaScript, Python/Node.js + (C#)
  • AI Integration: OpenAI Whisper + GPT API/Speech SDKs/LangChain/LLM APIs
  • Cloud / Backend Tools: Firebase/Firestore/Azure Functions/REST APIs/WebSockets
  • 3D Content Creation: Blender/Autodesk Maya/Cinema4D

Bonus:

  • BIM/GIS tools such as Revit, InfraWorks, Cesium, or ArcGIS (intended for future visualization of BIM, GIS, and Digital Twin models in AR via cloud-based workflows)
  • Wearable UX/UI best practices.

Contract Type & Compensation:
This is a project-based, part-time contract role. Compensation will be negotiated based on experience, availability, and scope of contribution. The position is supported by research activities and may extend depending on funding and project progress.

Send your CV and portfolio (plus GitHub or LinkedIn profile) to: jobs@metaparametric.com
Subject Line: AR Ray-Ban Meta Wayfarer Developer – [Your Name]

2. Meta {P.} Project *: AR Application Developer for Field Support (for Smartphone/Tablet-based Remote Assist)

Number of Positions: 1
Location: Remote / Hybrid
Start Date: July 2025
Duration: Project-based (with potential for ongoing solution development and deployment support)

MetaParametric is looking for an experienced AR Developer to create a mobile augmented reality application focused on remote field support. The solution allows on-site personnel to share live spatial views with remote experts via smartphone/tablet AR, enabling real-time collaboration, object identification, and interactive overlays. Inspired by Microsoft’s HoloLens Remote Assist but optimized for iOS/Android environments, this tool bridges the gap between experts and frontline operators without requiring wearables.

Key Responsibilities:

  • Develop AR-based remote collaboration tools for iOS/Android using Unity (AR Foundation) or native AR SDKs (ARKit/ARCore).
  • Integrate video streaming, annotation overlays, and 2-way voice interaction.
  • Enable object targeting, highlighting, and contextual feedback using real-world scene understanding.
  • Optimize the UI/UX for smooth field operation under variable connectivity and lighting.
  • Collaborate with backend developers for secure session handling and cloud-based data sync (Firebase, Azure, Supabase, etc.).
  • Prototype remote handoff, messaging, and expert dashboard components.
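As a small illustration of the annotation-overlay responsibility above, the sketch below serializes a remote expert's screen-space annotation into a message the field device could anchor and render. The schema and field names are hypothetical assumptions for illustration; a real implementation would send this over a WebRTC data channel or a service like Agora/Twilio.

```python
import json

def make_annotation(session_id, x, y, label):
    """Serialize a remote expert's annotation so the field device
    can anchor and render it in its AR view.

    x, y are normalized screen coordinates (0.0-1.0); the schema
    is an illustrative assumption, not a real protocol.
    """
    return json.dumps({
        "type": "annotation",
        "session": session_id,
        "anchor": {"x": x, "y": y},
        "label": label,
    })

msg = make_annotation("s-42", 0.31, 0.77, "Check this valve")
print(msg)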

Required Skills:

  • AR Development: Unity with AR Foundation, or native ARKit/ARCore development.
  • Real-time Communication: WebRTC, Agora, or Twilio integration.
  • Mobile Platforms: Experience with iOS and Android deployment and testing.
  • 3D Interaction Design: Spatial UI overlays, gesture-based interactions, object anchoring.
  • Cloud Integration: Firebase, Azure Communication Services, or similar.
  • Strong grasp of camera calibration, object tracking, and scene mapping.

Bonus:

  • Prior development of remote assist AR apps (similar to TeamViewer Pilot or Microsoft Remote Assist).
  • Familiarity with cross-device AR synchronization and persistent anchors.
  • Experience with AI-assisted scene understanding (object classification or segmentation).
  • Voice control integration with LLMs or local command logic.
  • Experience with on-device AI inference (TensorFlow Lite, Core ML).

Contract Type & Compensation:

This is a fixed-term collaborative position. Payment will be offered on a milestone or deliverable basis, based on the development phases of the mobile AR solution. Engagement terms are flexible and negotiable for top-tier candidates, with potential for integration into the MetaParametric product suite and extended support for field pilot deployments.

Apply by sending your portfolio, CV, and relevant project samples to: jobs@metaparametric.com
Subject Line: Meta{P.} AR Field Support Developer – [Your Name]

3. Meta {P.} Project: AR Application Developer for Architectural Recognition (for Smartphone/Tablet-based)

Number of Positions: 1
Location: Remote / Hybrid
Start Date: July 2025
Duration: Project-based (with opportunity for long-term enhancements and integration)

MetaParametric seeks an expert AR Developer to build a smartphone/tablet application that leverages AR to recognize architectural elements of buildings in real-time. The app will identify building names, types, and functions based on location, shape, and facade features, while providing precise spatial measurements through advanced distance estimation. This innovative tool will empower architects, urban planners, and heritage professionals with on-the-go building intelligence.

Key Responsibilities:

  • Design and develop AR applications using Unity (AR Foundation) or native ARKit/ARCore SDKs for iOS/Android.
  • Implement real-time object recognition and classification algorithms for architectural features.
  • Integrate GIS data, GPS/location services, and computer vision for contextual building identification.
  • Develop robust distance and dimension measurement features using camera-based depth sensing and spatial mapping.
  • Optimize app performance under varying outdoor lighting and complex urban environments.
  • Collaborate with UX designers to ensure intuitive user interaction and accurate overlay visualization.

Required Skills:

  • AR Development: Proficiency in Unity with AR Foundation or native ARKit/ARCore.
  • Computer Vision: Experience with image recognition frameworks (OpenCV, TensorFlow Lite, Core ML).
  • GIS & Location Services: Integration of GPS, map APIs (Google Maps, Cesium, Mapbox), and spatial databases.
  • 3D Spatial Mapping: Depth estimation, plane detection, and environment understanding.
  • Programming skills in C#, Swift, Kotlin, or Java for cross-platform deployment.
  • Strong understanding of mobile device camera calibration and sensor fusion.

Bonus:

  • Experience with building information modeling (BIM) data integration.
  • Familiarity with 3D reconstruction or photogrammetry techniques.
  • Knowledge of urban data ontologies or semantic spatial databases.
  • Exposure to AI-powered feature extraction or generative models for architectural details.
  • Prior experience in heritage conservation or smart city applications.

Contract Type & Compensation:

This is a fixed-term collaborative position. Compensation will be structured on a milestone or deliverable basis, aligned with development phases and feature rollouts. Engagement terms are flexible and negotiable for exceptional candidates, with potential integration into MetaParametric’s suite of AR tools and future product scaling.

Apply by sending your portfolio, CV, and relevant project samples to: jobs@metaparametric.com
Subject Line: Meta{P.} AR Architectural Recognition Developer – [Your Name]

4. Meta {P.} Project: AR Application Developer (AR-Quran)

Number of Positions: 1
Location: Remote / Hybrid
Start Date: July 2025
Duration: Project-based, with potential for long-term cultural and educational AR applications

MetaParametric is looking for a skilled AR Application Developer to create an immersive AR experience that intelligently displays relevant Quranic verses when users point their device cameras at natural environments or significant locations. The project blends religious education with cutting-edge XR/AR technology, enabling spiritual and contextual engagement inspired by concepts like the verse “فَأَينَمَا تُوَلُّوا فَثَمَّ وَجْهُ اللَّهِ”.

Key Responsibilities:

  • Develop AR applications for smartphones/tablets using Unity with AR Foundation, ARKit, or ARCore.
  • Implement real-time scene recognition and contextual triggering of Quranic verses based on environmental features.
  • Integrate AI models and NLP for semantic matching between detected scene elements and Quranic content.
  • Design intuitive UI overlays that present verses gracefully and accessibly in the AR view.
  • Optimize performance for varied lighting and outdoor conditions.
  • Collaborate with scholars or content experts to ensure accurate and respectful verse representation.

Required Skills:

  • Proven experience with AR development using Unity (AR Foundation), ARKit, or ARCore.
  • Familiarity with computer vision or image recognition techniques.
  • Experience with NLP (Natural Language Processing) or integration of AI for semantic/contextual applications.
  • Proficiency in C#, Swift, Kotlin, or Java for cross-platform development.
  • Ability to design smooth UI/UX tailored for AR experiences.
  • Strong collaboration and communication skills.

Bonus:

  • Experience with voice recognition or multimodal interactions in AR.
  • Knowledge of cloud integration for dynamic content updates.
  • Familiarity with XR spatial audio or immersive experience enhancements.
  • Prior work on educational or cultural AR projects.

Contract Type & Compensation:

This is a fixed-term collaborative position. Payment will be provided on a milestone or deliverable basis, tied to app development phases and feature completion. Engagement terms are flexible and negotiable for exceptional candidates, with possibilities for further AR projects in cultural and educational domains.

Apply by sending your portfolio, CV, and relevant project samples to: jobs@metaparametric.com
Subject Line: Meta{P.} AR-Quran Developer – [Your Name]

5. Multimodal AI-Powered 3D UI Designer (for AR/MR Smart Glasses)

Number of Positions: 1

Location: Hybrid/Remote

Start Date: July 2025

Duration: Project-based, with potential for long-term research collaboration

Project Focus: AI-Augmented Digital Twins & Conversational Interfaces

MetaParametric is seeking a Multimodal 3D UI/UX Designer with experience in AI-powered interface design and spatial computing. The role involves the conceptualization and prototyping of 3D user interfaces optimized for AR smart glasses (such as Meta Ray-Ban, HoloLens 2 or Magic Leap 2), focusing on multimodal interactions: voice, gesture, gaze, and touch.

The selected candidate will contribute to the development of next-generation XR platforms for immersive design collaboration, real-time monitoring, and human-AI interaction.

Key Responsibilities:

  • Design 3D user interfaces and interaction systems for AR/MR smart glasses.
  • Prototype multimodal input flows:
    • Voice (VUI)
    • Gesture (e.g., via Leap Motion / HoloLens)
    • Gaze (eye-tracking)
    • Touch (trackpad or controller)
  • Collaborate on UI pipelines with Unity/Unreal developers and AI engineers.
  • Integrate AI/LLM-powered conversational UIs using tools like Voiceflow, OpenAI Function Calling, or MindStudio.
  • Conduct usability testing with XR headsets; iterate based on data and cognitive load considerations.
  • Translate AI prompts into spatial UI behaviors (chat overlays, task flows, info panels).

Required Skills:

  • Design Tools:
    • Figma (with 3D plug-ins or Spline)
    • Blender / Adobe Substance 3D
    • Unity UI Toolkit or UGUI
    • Adobe XD / Spline / ProtoPie (for motion/UI prototyping)
  • XR Development Platforms:
    • Unity (XR Interaction Toolkit, MRTK, AR Foundation)
    • WebXR or A-Frame familiarity is a plus
  • Interaction Design Expertise:
    • Experience with voice UX, gesture-based UI, and spatial layout design
    • Knowledge of HCI, cognitive load, and human-centered XR design
    • Cross-disciplinary communication with developers, AI engineers, and BIM/data teams

Bonus:

  • Experience with LLM-integrated interfaces (OpenAI, Gemini, Claude, etc.)
  • Familiarity with Voiceflow, Gumloop, Relay.app, or Stack AI for voice/agent prototyping
  • Understanding of agentic UI behavior, goal-driven UI logic, or AutoGPT-like flows
  • Past projects involving smart glasses UI (Ray-Ban Meta, HoloLens, Magic Leap)
  • Familiarity with Azure Digital Twins, OpenUSD, or 3D scene graphs
  • Knowledge of cloud-based collaboration layers (Photon Fusion, Agora)

Contract Type & Compensation:
This is a fixed-term collaborative position. Payment will be offered on a milestone or deliverable basis, depending on the complexity of the UI/UX work. Engagement terms are flexible and open to negotiation for the right candidate.

Apply by sending your portfolio, CV, and relevant project samples to: jobs@metaparametric.com
Subject Line: Multimodal 3D UI Designer – [Your Name]

6. Spatially Intelligent Digital Twin Developer (for XR Devices)

Positions Available: 1
Location: Remote / Hybrid
Start Date: July 2025
Duration: Project-based (with potential for long-term R&D collaboration)

MetaParametric is recruiting a top-tier XR Developer/Data Integrator to build the next generation of spatially intelligent digital twins. You’ll combine real-world sensing, cloud services, and immersive XR platforms (AR/VR/MR) to enable real-time, AI-enhanced interaction with buildings, infrastructure, and urban systems.

This role bridges data fusion, 3D environments, and smart wearable interfaces (HoloLens 2, Magic Leap 2, Meta Quest 3, Meta Ray-Ban Smartglasses).

Key Responsibilities:

  • Develop interactive spatial digital twins with real-time sensor/data integrations (IoT, BIM, GIS).
  • Implement immersive experiences using Unity/Unreal for AR/MR headsets and mobile devices.
  • Integrate geospatial data, sensor fusion layers, and AI inference pipelines into 3D environments.
  • Connect to real-time data feeds via MQTT, WebSockets, REST APIs, or Azure Digital Twins.
  • Enable user interaction via gaze, voice, and spatial UI layers for smart glasses and tablets.
  • Collaborate with AI and 3D UI/UX teams to synchronize human-AI interaction and conversational agents.

Required Skills:

  • XR Development:
    • Unity (AR Foundation, XR Interaction Toolkit) or Unreal Engine (AR/VR pipelines)
    • MRTK, Vuforia, OpenXR, ARCore/ARKit
  • Digital Twin / IoT Integration:
    • Azure Digital Twins, Unity Reflect, Autodesk Forge, or Cesium
    • MQTT, WebSocket, JSON-based sensor streams, or Digital Twin Definition Language (DTDL)
  • 3D Data Handling:
    • IFC/BIM, glTF, USDz, CityGML, GIS (QGIS or CesiumJS)
  • Programming:
    • C#, Python, or JavaScript for API and backend logic
    • RESTful API consumption and data layer integration
  • Cloud & Edge Platforms:
    • Azure IoT Hub, AWS IoT, Google Cloud IoT, Firebase, or Supabase

Bonus:

  • Familiarity with LLM-enhanced interfaces, agent-based reasoning, or AutoGPT logic chains.
  • Knowledge of Sensor Fusion frameworks (OpenCV, ROS, or NVIDIA Isaac)
  • Experience working with BIM software (Revit API, Rhino/Grasshopper, InfraWorks)
  • Background in urban computing, facility management, or smart construction analytics
  • Exposure to spatial databases (PostGIS, GeoJSON) and location-based services
  • Integration experience with Time-Series Databases (InfluxDB, Azure Data Explorer)

Contract Type & Compensation:

This is a fixed-term collaborative position. Payment will be offered on a milestone or deliverable basis, depending on the scope and complexity of development tasks. Engagement terms are flexible and negotiable for highly qualified candidates, with potential for extension into larger international research projects.

Apply by sending your portfolio, CV, and relevant project samples to: jobs@metaparametric.com
Subject Line: Meta{P.} Spatially Intelligent Digital Twin Developer – [Your Name]

[Collaborative & Volunteer Roles (Unpaid, Credited)]:

7. Contributor (Author): Book Chapter on AI and XR Integration

Number of Positions: 1

Position Type: Academic/Research Collaboration

Location: Hybrid / Remote

Submission Deadline: September 15, 2025

Chapter Submission Format: peer-reviewed contribution

Book Title: “Immersive Intelligence: AI-Driven Approaches in Extended Reality Design”; (2026)

MetaParametric invites expressions of interest for academic contributors to co-author or independently author a book chapter focusing on the integration of Artificial Intelligence and Extended Reality (XR) technologies in architecture, design, spatial computing, education, or simulation.

We welcome contributions grounded in research, applied design, or theoretical frameworks that align with our mission to advance immersive, intelligent environments.

Suggested Topics:

  • Generative AI in XR-driven design workflows.
  • AI agents and conversational interfaces in virtual/augmented environments.
  • AI-enhanced Digital Twins for architecture and urbanism.
  • Ethics, agency, and plausibility in AI-augmented XR systems.
  • Spatial computing, cognition, and machine learning applications.

Contributor Profile:

  • Researchers, PhD candidates, postdoctoral fellows, or experienced practitioners in relevant fields (XR, AI, architecture, HCI, data science).
  • Ability to produce academic writing in English (APA style).
  • Willingness to collaborate on peer review, formatting, and revisions.

Contract Type & Compensation:
This is a voluntary academic contribution aimed at co-authoring a peer-reviewed book chapter. While this position is unpaid, contributors will receive formal authorship credit and full recognition in the publication. Ideal for researchers or professionals seeking academic exposure and collaboration.

Submission Guidelines:

  • Submit an abstract (max 300 words) and short bio (100–150 words) to: founder@metaparametric.com
  • Include 1–2 examples of previous academic writing or relevant publications.
  • Subject Line: AI & XR Integration Author – [Your Name]
8. Contributor (Unity XR Expert / 3D UI Advisor): Book Project on Immersive Design Manual

Number of Positions: 1
Location: Remote / Flexible
Start Date: Rolling (Summer 2025 preferred)
Duration: Short-term to Medium-term (Flexible, based on availability)
Project Focus: Writing an Open-Access Book on 3D User Interface (3D UI) Design for Architecture and Urban Design

We’re seeking a collaborative and experienced professional (Unity XR developer or interaction designer passionate about immersive technologies and knowledge sharing) to contribute as a technical advisor and co-creator for a manual/book on 3D UI design, tailored for architects, urban designers, and spatial computing enthusiasts.

You’ll play a crucial role in shaping the technical depth and practical content of the book, helping demystify 3D UI/UX design in AR/VR/MR for professionals unfamiliar with spatial interaction principles.

Key Responsibilities:

  • Serve as a subject-matter expert on Unity-based 3D UI/UX for VR/AR/MR.
  • Answer structured interview questions and guide content development with best practices, real-world examples, and references.
  • Optionally co-author or be credited as a contributor for relevant chapters.
  • Help build a reference kit of interactive components and scene templates for readers.
  • Collaborate with an interdisciplinary team focused on immersive urban and architectural design tools.

Required Skills:

  • Expertise in Unity (C#) with a strong portfolio of 3D UI/UX for XR applications.
  • Experience with VR/MR/AR platforms like Meta Quest, HoloLens, Magic Leap, or mobile AR (Unity XR Toolkit, MRTK, AR Foundation, etc.).
  • Familiarity with interaction design principles, spatial menus, gaze and gesture control, and contextual UI.
  • Bonus: Knowledge of architectural workflows, digital twins, BIM integration, or visual scripting (Bolt/Playmaker).
  • Bonus: Experience contributing to educational content, tutorials, or open-source tools.

Bonus:

  • Gain visibility as a contributor to a pioneering book/manual aimed at spatial designers.
  • Collaborate with a multi-disciplinary research initiative (MetaParametric) at the intersection of architecture, XR, AI, and cloud computing.
  • Showcase your work, philosophy, and design logic in a resource shared globally.
  • No code deliverables or product deadlines – only guidance, feedback, and mentorship-level involvement.

Contract Type & Compensation:
This is an unpaid, voluntary, and credit-based collaboration. You’ll be fully credited in the book and website, with the opportunity to include your portfolio links, GitHub, or LinkedIn. If preferred, you may contribute anonymously or under a pseudonym.

To Apply:
Send a short intro email with your background, a portfolio link (or sample project), and optionally, your preferred collaboration format (interviewee, chapter reviewer, technical advisor); founder@metaparametric.com

Subject Line: Unity XR Expert / 3D UI Advisor – [Your Name]

* The Meta {P.} Project:

The Meta {P.} Project refers to a series of Augmented Reality (AR) application development initiatives conducted under the official license and intellectual property framework of the MetaParametric™ platform. These projects are designed for commercial deployment and align with the platform’s broader vision of enabling spatial computing solutions across architecture, field operations, digital heritage, and immersive learning domains.
Currently, the Meta {P.} Project is in active development, encompassing multiple use cases such as field support via smartphone/tablet-based AR, architectural recognition, and AI-augmented Quranic experiences using smart glasses and XR-capable mobile devices.
All contributors, including developers, researchers, designers, and collaborators, are required to formally engage by signing a Confidentiality and Intellectual Property Agreement (CIPA) as well as a Commercial Development Contract. These legal instruments ensure protection of proprietary assets, clarify roles and responsibilities, and regulate licensing terms for future commercialization and distribution of the resulting applications.
The Meta {P.} Project operates within a modular and scalable architecture, enabling seamless integration with cloud-based services, AI pipelines, and spatial data layers, thereby positioning it as a cutting-edge framework for next-generation AR experiences.

About the blog

RAW is a WordPress blog theme design inspired by the Brutalist concepts from the homonymous Architectural movement.

Get updated

Subscribe to our newsletter and receive our very latest news.