Experiential Design / Final Task

6/7/2025 - 27/7/2025 (Week 12 - Week 15)
Shawn Wong Kai Hen / 0375372
Major Project 1 / Bachelor of Design (Hons) in Creative Media
Experiential Design - Final Task



  INSTRUCTIONS  




  Final Task Overview  

1. Focus on Creating the Game Scene


When I started to create my game scene, I wanted my coloured shapes to spawn in the real world through AR, so I tried to figure out how and searched for tutorials. I couldn't find anyone who had done this before, so I had no idea where to begin; I asked ChatGPT and followed its steps one by one. Here I am using Ground Plane and Plane Finder: once the ground is detected and the user taps it, the coloured shapes spawn in the AR camera view.
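The tap-to-spawn flow described above can be sketched roughly as below. This is a minimal sketch, not the project's actual script: it assumes Vuforia's Ground Plane feature, and names like `shapePrefabs` and `groundPlaneStage` are my own placeholders.

```csharp
using UnityEngine;
using Vuforia;

public class GroundShapeSpawner : MonoBehaviour
{
    public ContentPositioningBehaviour contentPositioning; // on the Plane Finder
    public GameObject[] shapePrefabs;   // coloured shape prefabs to spawn
    public Transform groundPlaneStage;  // parent so shapes stay anchored to the ground

    // Wired to the Plane Finder's "On Interactive Hit Test" event in the Inspector,
    // which fires when the user taps a detected ground plane.
    public void OnInteractiveHitTest(HitTestResult result)
    {
        // Snap the stage to the tapped point on the detected ground.
        contentPositioning.PositionContentAtPlaneAnchor(result);

        // Spawn a random coloured shape as a child of the stage.
        var prefab = shapePrefabs[Random.Range(0, shapePrefabs.Length)];
        Instantiate(prefab, groundPlaneStage.position, Quaternion.identity, groundPlaneStage);
    }
}
```

Parenting the spawned shape under the Ground Plane Stage (rather than the AR Camera) is what keeps it fixed in the real world as the phone moves.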

2. Create Game Scene UI


Here I started creating the UI for the game scene, which has the score, timer, and a menu button at the top. The border is decorative, so the AR camera view isn't too plain while it is on, and at the bottom there is text telling the user to tap to start the game. When the game starts, a voice prompt announces which colour the user needs to tap, and the on-screen text changes to that colour as well.
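The colour prompt (text plus voice) could look something like the sketch below. This is an assumption of how such a script might be structured, not the project's actual code; the field names and colour list are hypothetical.

```csharp
using UnityEngine;
using UnityEngine.UI;

public class ColorPromptUI : MonoBehaviour
{
    public Text promptText;          // e.g. "Tap the Red Color"
    public AudioSource voiceSource;
    public AudioClip[] colorClips;   // one voice clip per colour, same order as colorNames
    public string[] colorNames = { "Red", "Blue", "Green", "Yellow" };

    // Called at the start of each round with the index of the target colour.
    public void AnnounceColor(int colorIndex)
    {
        promptText.text = "Tap the " + colorNames[colorIndex] + " Color";
        voiceSource.PlayOneShot(colorClips[colorIndex]); // voice prompt
    }
}
```

Keeping `colorClips` and `colorNames` in the same order is what keeps the audio and the text prompt in sync.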

3. Create Game Over & Main Menu Panels


Because this is a game, a game over panel and a main menu panel are needed. The game over panel pops up when the timer runs out and shows the score the user achieved. The main menu panel lets the user pause the game and then either continue or quit, the same options as on the game over panel.
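A common Unity pattern for this kind of panel logic is sketched below; it pauses gameplay by freezing `Time.timeScale`. This is a hedged sketch with assumed names, not the project's actual script.

```csharp
using UnityEngine;
using UnityEngine.UI;

public class GamePanels : MonoBehaviour
{
    public GameObject gameOverPanel;
    public GameObject mainMenuPanel;
    public Text finalScoreText;

    // Called when the timer runs out.
    public void ShowGameOver(int score)
    {
        finalScoreText.text = "Score: " + score;
        gameOverPanel.SetActive(true);
    }

    // Hooked to the menu button.
    public void PauseGame()
    {
        Time.timeScale = 0f;           // freezes timers and spawning
        mainMenuPanel.SetActive(true);
    }

    public void ContinueGame()
    {
        Time.timeScale = 1f;
        mainMenuPanel.SetActive(false);
    }
}
```

One caveat with `Time.timeScale = 0f`: anything driven by unscaled time (such as some UI animations or audio) keeps running, so the pause logic should account for that.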

4. Apply UI Elements


This part was simple: I just imported my UI design from Adobe Illustrator and applied it to the game. It was quick and easy, just aligning the timer and score with the elements, and done!

5. Game Scene Controller Scripting


This is the most important part: the script. Why? Because I control the whole game through it, so it acts as my game scene controller. Here I need to assign all the shape prefabs I want to spawn, the voice prompts, all the text and UI, and all the SFX.


So there are many scripts here showing how I control my game through code. Scripting makes it much easier for me to control when the game needs to perform an action. The scripts are also linked to each other so that they know which 3D objects can be tapped and which coloured shape will spawn in the game.
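Pulling the pieces above together, a controller skeleton might look like the following. This is a minimal sketch under assumed names (`StartGame`, `AddScore`, `roundTime`); the real controller also wires up prefabs, voice prompts, and SFX as described.

```csharp
using UnityEngine;
using UnityEngine.UI;

public class GameSceneController : MonoBehaviour
{
    public Text scoreText;
    public Text timerText;
    public float roundTime = 60f;

    int score;
    float timeLeft;
    bool running;

    // Called by the "Tap to Start" handler.
    public void StartGame()
    {
        score = 0;
        timeLeft = roundTime;
        running = true;
    }

    void Update()
    {
        if (!running) return;
        timeLeft -= Time.deltaTime;
        timerText.text = Mathf.CeilToInt(timeLeft).ToString();
        if (timeLeft <= 0f)
        {
            running = false;
            // ...show the game over panel with the final score here.
        }
    }

    // Called by the shape tap handlers on a correct tap.
    public void AddScore(int points)
    {
        score += points;
        scoreText.text = "Score: " + score;
    }
}
```

Having the tap handlers call back into one controller like this is what lets separate scripts stay linked while each shape keeps its own tap logic.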

6. UI Design


Here are the elements I designed and created in Adobe Illustrator. Based on some references I found online, I tried to come up with my own design, applied it to my game, and made the game look better.



  Previous Presentation Slide  

  Presentation Walkthrough Video  


Download The Game: Google Drive



  REFLECTION  

Experience
Developing this AR-based color shape game was a challenging but rewarding process. I began by setting up basic AR functionality using Unity and Vuforia, gradually evolving the project into a multi-scene interactive game. Initially, I faced technical hurdles with Unity Remote, prefab instantiation, and scene transitions. However, through persistence and step-by-step debugging, I was able to build core features like random shape spawning, audio prompts, timer mechanics, and score tracking.

The integration of Ground Plane tracking, prefab arrays, shape recognition via PlayerPrefs, and interaction scripts gave me practical experience in handling both the design and technical structure of an AR mobile game. Implementing modular scripts and UI controls taught me the importance of scalable architecture when working across scenes and shape types.


Findings
  • PlayerPrefs is crucial for passing selected data between AR Scene and Game Scene. Any mismatch in key names or values directly broke the shape logic.
  • Prefab consistency matters: all shape prefabs (cube, circle, etc.) must have the same components attached (e.g., ShapeData, Collider, CubeTapHandler) to ensure they work uniformly during gameplay.
  • Ground Plane tracking worked better with PlaneFinder and tap-to-place interaction, especially on mobile devices, compared to just placing content immediately.
  • UI Timing Logic (Ready Countdown → Timer Start → Round Logic) needed to be carefully ordered to avoid skipping transitions or overlapping events.
  • Audio feedback enhances the playability significantly, but it requires a reliable mapping between correct color indices and sound clips.
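The PlayerPrefs finding above can be illustrated with a small fragment. The key name `"SelectedShape"` is a hypothetical example; the point is that the exact same string must be used on both sides, or the shape logic silently breaks.

```csharp
using UnityEngine;

// In the AR Scene, before loading the Game Scene:
PlayerPrefs.SetString("SelectedShape", "Cube");   // key must match exactly
PlayerPrefs.Save();

// In the Game Scene, after loading:
string shape = PlayerPrefs.GetString("SelectedShape", "Cube"); // fallback default
```

Supplying a default value in `GetString` guards against the case where the key was never set (e.g., the Game Scene is run directly in the Editor).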

Observations
  • Debugging on mobile was slower than on PC; small changes required frequent builds. Testing with Unity Remote was not always reliable for AR.
  • Modularity saved time. Separating logic into scripts like GameSceneController, CubeTapHandler, and ShapeTapConfirm allowed reusability across shapes and scenes.
  • Scene design impacts performance. Initially, having too many objects under ARCamera or missing anchoring caused shapes to follow the camera. Assigning objects to GroundPlaneStage correctly was key.
  • User guidance is critical in AR. Text like “Tap to Start” or “Tap the Red Color” helped direct player actions, especially during the first-time experience.
  • If not planned carefully, nested hierarchy confusion (e.g., mixing anchor, canvas, and ground plane setup) can lead to unexpected behavior in AR scene rendering or object placement.
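The camera-following pitfall noted above comes down to parenting. A minimal sketch of the fix, with assumed names, is to always instantiate spawned objects under the Ground Plane Stage rather than under the AR Camera:

```csharp
using UnityEngine;

public class StageSpawner : MonoBehaviour
{
    public Transform groundPlaneStage; // the Vuforia Ground Plane Stage object
    public GameObject shapePrefab;

    public void SpawnOnStage(Vector3 localPosition)
    {
        // Parent under the Ground Plane Stage, NOT the AR Camera, so the
        // shape stays anchored to the detected ground instead of following
        // the device as it moves.
        GameObject shape = Instantiate(shapePrefab, groundPlaneStage);
        shape.transform.localPosition = localPosition;
    }
}
```

If a shape is accidentally parented under the camera, its transform is expressed relative to the device, which is exactly the "shapes follow the camera" symptom described above.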




