Violet Forest // ⍣٭⋆⋆⍣


Technical Artist at Meta (2023-2024)

Virtual Reality · Mixed Reality · Quest 3

As a Technical Artist for Horizon Worlds, I collaborated with engineers to help build Meta's own proprietary game engine. This included game asset ingestion for animation, VFX, USD, materials, and scripts. Tools I worked with include Maya, Blender, PopcornFX, and Perforce. I also built a Mixed Reality prototype for an internal hackathon.

Technical Artist for Anouschka (2023)

Augmented Reality · Unity · Shader Graph

Interactive shaders with Shader Graph and scene lighting in Unity. Prototyping with Niantic Lightship.

Floodplains.xyz VR (2022-2023)

C# · Unity · Technical Artist

A Unity game for WebVR and Quest 2/3. I built the game in Unity and integrated it into the website. Game mechanics included picking up objects with a raycast and a timer that ran out once the house flooded. I also integrated assets created by artist and producer Michelle Brown, and was given creative control over the water shader.
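
A minimal sketch of how those two mechanics can fit together in Unity, assuming a single held object and a simple button binding; the component, tag, and field names (grabDistance, floodDuration, "Pickup") are illustrative placeholders, not the shipped Floodplains code.

```csharp
using UnityEngine;

// Illustrative sketch of the two core mechanics: raycast-based pickup
// and a flood timer that ends the round when it runs out.
// Names, tags, and the input binding are placeholders.
public class PickupAndFloodTimer : MonoBehaviour
{
    public Transform handOrCamera;     // origin of the pickup ray (controller or camera)
    public float grabDistance = 3f;    // how far the player can reach
    public float floodDuration = 120f; // seconds until the house is flooded

    float timeLeft;
    Transform heldObject;

    void Start() => timeLeft = floodDuration;

    void Update()
    {
        // Count down; when the timer runs out, the round is over.
        timeLeft -= Time.deltaTime;
        if (timeLeft <= 0f)
        {
            Debug.Log("The house has flooded - round over.");
            enabled = false;
            return;
        }

        // Simplified input: one button toggles grab/release.
        if (Input.GetButtonDown("Fire1"))
        {
            if (heldObject == null)
            {
                // Raycast forward from the hand/camera to find a grabbable object.
                if (Physics.Raycast(handOrCamera.position, handOrCamera.forward,
                                    out RaycastHit hit, grabDistance)
                    && hit.collider.CompareTag("Pickup"))
                {
                    heldObject = hit.transform;
                    heldObject.SetParent(handOrCamera); // carry the object with the hand
                }
            }
            else
            {
                heldObject.SetParent(null);             // drop it back into the scene
                heldObject = null;
            }
        }
    }
}
```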

Creator for Snap Spectacles (2021)

Augmented Reality · AR Glasses · Hand-Tracking · Lens Studio

In partnership with Spectacles, I was commissioned to create a Lens for Snap's augmented reality glasses; the Lens was released for Spectacles and for the Snapchat app. I was given full creative control over the project. I created a side quest in which you talk to an NPC placed in the world, receive a quest to gather items using hand-tracking, and are rewarded with a magic power on completing it. The concept was that once AR glasses become widely available and ubiquitous, there could be an MMORPG-like game with NPCs and characters placed throughout the world that you interact with over the course of your day to level up attributes like power, armor, and weapons.

Spark AR Face Filter (2023)

Augmented Reality · Spark AR

Hand-Tracking in VR (2023)

Virtual Reality · Quest · Hand Tracking

Parenting particle systems to the joints of the fingers. Made with WebXR + Unity.
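
A minimal sketch of the parenting idea, assuming the hand-tracking SDK (OVRSkeleton, Unity XR Hands, or similar) already exposes the finger joints as Transforms; how those joints are fetched varies per SDK and is left out here.

```csharp
using UnityEngine;

// Sketch: attach one particle system to each tracked finger joint so the
// particles follow the hand. Assumes the hand-tracking SDK already
// exposes the joint Transforms; field names are illustrative.
public class FingerParticles : MonoBehaviour
{
    public ParticleSystem particlePrefab; // the effect to spawn on each joint
    public Transform[] fingerJoints;      // joint transforms supplied by the hand-tracking SDK

    void Start()
    {
        foreach (Transform joint in fingerJoints)
        {
            // Parenting keeps each emitter locked to its joint as the hand moves.
            ParticleSystem ps = Instantiate(particlePrefab, joint.position, joint.rotation, joint);
            ps.Play();
        }
    }
}
```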

UX Design Technologist (2018-2020)

Unity · VFX Graph · Shader Graph · Virtual Reality · Mixed Reality · Leap Motion · WiFi Microcontrollers · MQTT · FramerX · TouchDesigner · LEDs · Voice Assistants · RunwayML · Face Recognition · Eye Tracking · Pose Estimation

I worked as a full-time employee at Volkswagen Future Center Europe in Berlin, Germany. I collaborated with UX designers and technologists to conceptualize and rapidly prototype UX solutions for Level 3-5 self-driving vehicles in private and car-pooling situations.

Some of the rapid prototyping projects I worked on:

  • Concepts and prototypes for UX solutions using facial recognition, pose estimation, and eye tracking.
  • An audio-reactive visualization using Unity's VFX and ShaderGraph.
  • UI flows using state machines in Unity (see the sketch after this list).
  • Prototypes for automation systems using MQTT, WiFi microcontrollers, Raspberry Pis, and LEDs.
  • Concepts and designs for voice and digital assistants.
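
For the state-machine item above, here is a minimal enum-driven UI flow sketch; the states, panels, and event names are hypothetical stand-ins, not the Volkswagen prototypes themselves.

```csharp
using UnityEngine;

// Illustrative enum-driven UI flow: each vehicle event moves the UI to
// a new state, and only the panel for the current state is shown.
public class RideUiFlow : MonoBehaviour
{
    enum UiState { Idle, Boarding, Riding, Arriving }

    [SerializeField] GameObject idlePanel;
    [SerializeField] GameObject boardingPanel;
    [SerializeField] GameObject ridingPanel;
    [SerializeField] GameObject arrivingPanel;

    UiState state = UiState.Idle;

    // Hypothetical events that drive the flow.
    public void OnPassengerDetected() => SetState(UiState.Boarding);
    public void OnSeatbeltFastened()  => SetState(UiState.Riding);
    public void OnNearDestination()   => SetState(UiState.Arriving);
    public void OnDoorsOpened()       => SetState(UiState.Idle);

    void SetState(UiState next)
    {
        state = next;

        // Show only the panel that belongs to the current state.
        idlePanel.SetActive(state == UiState.Idle);
        boardingPanel.SetActive(state == UiState.Boarding);
        ridingPanel.SetActive(state == UiState.Riding);
        arrivingPanel.SetActive(state == UiState.Arriving);
    }
}
```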

We also collaborated with Porsche on a mixed-reality experience.

Designer & Developer for AR + Computer Vision App

OpenCV · ARKit · Unity

I was hired to build a mobile app that could detect the shape of a heart-shaped candy and place augmented reality content on top of it. This research led me to experiment with OpenCVForUnity contour detection, color detection, and training a Haar cascade for object detection. In the end I settled on using OpenCVForUnity contour detection to trigger AR Foundation's point cloud feature detection.
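
A rough sketch of the contour-detection trigger, assuming OpenCVForUnity's Java-style API (namespace and helper names can differ between versions); the heart-shape classification and the hand-off to AR Foundation are omitted, and minArea is an illustrative threshold.

```csharp
using System.Collections.Generic;
using UnityEngine;
using OpenCVForUnity.CoreModule;
using OpenCVForUnity.ImgprocModule;
using OpenCVForUnity.UnityUtils;

// Sketch: find a sufficiently large contour in the camera frame and use
// that as the trigger for AR placement. Assumes OpenCVForUnity's
// Java-style API; exact namespaces may differ by version.
public class HeartContourTrigger : MonoBehaviour
{
    public double minArea = 500.0; // ignore small specks of noise (illustrative value)

    public bool FrameContainsCandidate(Texture2D cameraFrame)
    {
        // Copy the camera frame into an OpenCV Mat.
        Mat rgba = new Mat(cameraFrame.height, cameraFrame.width, CvType.CV_8UC4);
        Utils.texture2DToMat(cameraFrame, rgba);

        // Grayscale + Otsu threshold so the candy stands out from the background.
        Mat gray = new Mat();
        Imgproc.cvtColor(rgba, gray, Imgproc.COLOR_RGBA2GRAY);
        Imgproc.threshold(gray, gray, 0, 255, Imgproc.THRESH_BINARY + Imgproc.THRESH_OTSU);

        // External contours only; a large enough blob counts as a candidate,
        // which would then hand off to AR Foundation's point cloud placement.
        var contours = new List<MatOfPoint>();
        Imgproc.findContours(gray, contours, new Mat(), Imgproc.RETR_EXTERNAL,
                             Imgproc.CHAIN_APPROX_SIMPLE);

        foreach (MatOfPoint contour in contours)
            if (Imgproc.contourArea(contour) > minArea)
                return true;

        return false;
    }
}
```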

My research in computer vision sparked my interest in machine learning more broadly, and I am continuing that research with tools like Google's open-source MediaPipe for hand tracking, where I collaborated on building a mobile app to control open-source prosthetics.
