6/7/2025 - 27/7/2025 (Week 12 - Week 15)
Shawn Wong Kai Hen / 0375372
Major Project 1 / Bachelor of Design (Hons) in Creative Media
Experiential Design - Final Task
The integration of Ground Plane tracking, prefab arrays, shape recognition via PlayerPrefs, and interaction scripts gave me practical experience in handling both the design and technical structure of an AR mobile game. Implementing modular scripts and UI controls taught me the importance of scalable architecture when working across scenes and shape types.
- PlayerPrefs is crucial for passing selected data between the AR Scene and the Game Scene. Any mismatch in key names or values directly broke the shape logic.
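A minimal sketch of this hand-off, assuming a key named "SelectedShape" and a scene named "GameScene" (both names are illustrative, not necessarily the ones used in the project):

```csharp
using UnityEngine;
using UnityEngine.SceneManagement;

// Passing the selected shape from the AR Scene to the Game Scene via
// PlayerPrefs. The key string must match exactly in both scenes, or the
// receiving scene silently falls back to the default.
public static class ShapeSelection
{
    const string Key = "SelectedShape"; // hypothetical key name

    // Called in the AR Scene when the player confirms a shape.
    public static void SaveAndContinue(string shapeName)
    {
        PlayerPrefs.SetString(Key, shapeName);
        PlayerPrefs.Save();
        SceneManager.LoadScene("GameScene"); // hypothetical scene name
    }

    // Called in the Game Scene; a default avoids a null/empty shape
    // if the key was never written.
    public static string Load()
    {
        return PlayerPrefs.GetString(Key, "Cube");
    }
}
```

Using a single `const` key shared by both the save and load paths is one simple way to rule out the key-name mismatches mentioned above.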
- Prefab consistency matters: all shape prefabs (cube, circle, etc.) must have the same components attached (e.g., ShapeData, Collider, CubeTapHandler) to ensure they work uniformly during gameplay.
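One way to enforce this consistency in Unity is `RequireComponent`, which makes the editor add the listed components automatically whenever the handler is attached. The fields of ShapeData below are illustrative, not the project's actual layout:

```csharp
using UnityEngine;

// Illustrative shape metadata component; field names are assumptions.
public class ShapeData : MonoBehaviour
{
    public string shapeName;
    public int colorIndex;
}

// RequireComponent guarantees every prefab carrying CubeTapHandler also
// has a Collider (needed for tap detection) and a ShapeData component,
// so all shape prefabs behave uniformly at runtime.
[RequireComponent(typeof(Collider))]
[RequireComponent(typeof(ShapeData))]
public class CubeTapHandler : MonoBehaviour
{
    void OnMouseDown() // fires on tap/click when a Collider is present
    {
        var data = GetComponent<ShapeData>();
        Debug.Log($"Tapped shape: {data.shapeName}");
    }
}
```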
- Ground Plane tracking worked better with PlaneFinder and tap-to-place interaction, especially on mobile devices, compared to placing content immediately.
- UI Timing Logic (Ready Countdown → Timer Start → Round Logic) needed to be carefully ordered to avoid skipping transitions or overlapping events.
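The ordering above can be sketched as a single Unity coroutine, where each phase only begins once the previous one has finished, so transitions cannot be skipped or overlap. Method and field names here are illustrative, not the project's actual code:

```csharp
using System.Collections;
using UnityEngine;

// Sequencing Ready Countdown -> Timer Start -> Round Logic in one
// coroutine so the phases can never interleave.
public class RoundFlow : MonoBehaviour
{
    public int readySeconds = 3;    // hypothetical durations
    public float roundSeconds = 30f;

    void Start() => StartCoroutine(RunRound());

    IEnumerator RunRound()
    {
        // Phase 1: ready countdown; nothing else runs until it ends.
        for (int i = readySeconds; i > 0; i--)
        {
            Debug.Log($"Ready... {i}");
            yield return new WaitForSeconds(1f);
        }

        // Phase 2: the round timer starts only after the countdown.
        float remaining = roundSeconds;
        while (remaining > 0f)
        {
            remaining -= Time.deltaTime;
            yield return null; // wait one frame
        }

        // Phase 3: round-end logic runs exactly once, after the timer.
        Debug.Log("Round over");
    }
}
```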
- Audio feedback enhances playability significantly, but it requires a reliable mapping between correct color indices and sound clips.
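A simple way to keep that mapping reliable, sketched below with illustrative names, is an array of clips indexed the same way as the color list, plus a bounds check so a bad index never throws:

```csharp
using UnityEngine;

// Maps a color index to its audio clip. Keeping the clips in an array
// ordered identically to the color list removes the chance of a
// color/sound mismatch; field names are assumptions, not project code.
public class ColorAudioFeedback : MonoBehaviour
{
    public AudioSource source;
    public AudioClip[] clipsByColorIndex; // order must match the color list

    public void PlayForColor(int colorIndex)
    {
        if (colorIndex < 0 || colorIndex >= clipsByColorIndex.Length)
        {
            Debug.LogWarning($"No clip for color index {colorIndex}");
            return; // fail quietly instead of throwing mid-round
        }
        source.PlayOneShot(clipsByColorIndex[colorIndex]);
    }
}
```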
- Debugging on mobile was slower than on PC; small changes required frequent builds. Testing with Unity Remote was not always reliable for AR.
- Modularity saved time. Separating logic into scripts like GameSceneController, CubeTapHandler, and ShapeTapConfirm allowed reuse across shapes and scenes.
- Scene design impacts performance. Initially, having too many objects under ARCamera or missing anchoring caused shapes to follow the camera; assigning objects to GroundPlaneStage correctly was key.
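The anchoring fix can be sketched as follows: spawned shapes are parented to the stage transform rather than left under (or near) the camera, so they stay fixed on the tracked plane. The spawner class and field names are hypothetical; in this project the parent would be the GroundPlaneStage object:

```csharp
using UnityEngine;

// Spawns shapes as children of the ground-plane stage so they stay
// anchored in world space instead of following the ARCamera.
public class ShapeSpawner : MonoBehaviour
{
    public Transform groundPlaneStage; // assign the stage, never the camera
    public GameObject shapePrefab;

    public GameObject Spawn(Vector3 localPosition)
    {
        // Instantiating with the stage as parent ties the shape's
        // transform to the tracked plane.
        var shape = Instantiate(shapePrefab, groundPlaneStage);
        shape.transform.localPosition = localPosition;
        return shape;
    }
}
```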
- User guidance is critical in AR. Text like “Tap to Start” or “Tap the Red Color” helped direct player actions, especially during the first-time experience.