Our First Try at AR. Here’s What We Learned
Augmented Reality (AR) has been on our radar for a while, but this was our first real attempt to build something from scratch. We wanted to explore what it actually takes to create an AR experience people can hold in their hands—and whether the reality lives up to the hype.
Here’s what we discovered.

Why We Decided to Try AR
We’ve all seen AR in action, whether it’s trying on glasses virtually, seeing how furniture fits in a living room, or catching digital creatures in games. It’s no longer just a novelty; it’s becoming part of how brands, educators, and creators connect with audiences.
We decided to challenge ourselves to build a simple AR prototype: a 3D object that appears on any flat surface and responds to user interaction.
How We Built Our First AR Prototype
We chose Unity with AR Foundation because it supports both iOS (ARKit) and Android (ARCore), letting us test across devices without starting over each time.
Our process looked like this:
1. Concept – A floating, animated object that the user could place in their environment and rotate by touch.
2. 3D Asset – We sourced a model from Sketchfab (a free repository of 3D objects).
3. Development – We built the scene in Unity, set up plane detection, anchors, and interaction scripts (see the sketch after this list).
4. Testing – We deployed to smartphones and tablets to see how the experience performed in real environments.
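To give a sense of what the development step involves, here's a minimal tap-to-place sketch built on AR Foundation's ARRaycastManager. It isn't our exact script; the component name TapToPlace and the placedPrefab field are placeholders, and it assumes plane detection is already running in the session.

```csharp
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR.ARFoundation;
using UnityEngine.XR.ARSubsystems;

// Attach next to an ARRaycastManager on the AR Session Origin.
public class TapToPlace : MonoBehaviour
{
    [SerializeField] GameObject placedPrefab;   // the 3D asset to spawn (placeholder name)
    ARRaycastManager raycastManager;
    static readonly List<ARRaycastHit> hits = new List<ARRaycastHit>();
    GameObject spawnedObject;

    void Awake() => raycastManager = GetComponent<ARRaycastManager>();

    void Update()
    {
        if (Input.touchCount == 0) return;
        Touch touch = Input.GetTouch(0);
        if (touch.phase != TouchPhase.Began) return;

        // Cast a ray from the touch point against detected planes.
        if (raycastManager.Raycast(touch.position, hits, TrackableType.PlaneWithinPolygon))
        {
            Pose hitPose = hits[0].pose;
            if (spawnedObject == null)
                spawnedObject = Instantiate(placedPrefab, hitPose.position, hitPose.rotation);
            else
                spawnedObject.transform.SetPositionAndRotation(hitPose.position, hitPose.rotation);
        }
    }
}
```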
We quickly learned that even a simple AR app involves more detail than it seems.
What Worked Well

Real-time plane detection felt almost magical. Watching the system map out surfaces and place objects with accuracy was exciting.
Interaction was intuitive. Touch controls to rotate and scale the object made the experience feel responsive.
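The interaction logic itself stayed simple: a one-finger drag spins the object, a two-finger pinch scales it. Here's a rough sketch using Unity's touch input; the component name and the speed and scale limits are illustrative values, not our production settings.

```csharp
using UnityEngine;

// Attach to the placed object: drag with one finger to rotate, pinch to scale.
public class RotateAndScale : MonoBehaviour
{
    [SerializeField] float rotateSpeed = 0.2f;   // degrees per pixel dragged
    [SerializeField] float minScale = 0.1f;
    [SerializeField] float maxScale = 3f;

    void Update()
    {
        if (Input.touchCount == 1)
        {
            // One-finger horizontal drag spins the object around its vertical axis.
            Touch touch = Input.GetTouch(0);
            if (touch.phase == TouchPhase.Moved)
                transform.Rotate(Vector3.up, -touch.deltaPosition.x * rotateSpeed, Space.World);
        }
        else if (Input.touchCount == 2)
        {
            // Pinch: compare finger spacing this frame against last frame.
            Touch a = Input.GetTouch(0);
            Touch b = Input.GetTouch(1);
            float currentDist = Vector2.Distance(a.position, b.position);
            float previousDist = Vector2.Distance(a.position - a.deltaPosition, b.position - b.deltaPosition);
            float factor = previousDist > 0f ? currentDist / previousDist : 1f;

            float newScale = Mathf.Clamp(transform.localScale.x * factor, minScale, maxScale);
            transform.localScale = Vector3.one * newScale;
        }
    }
}
```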
Deploying to devices was faster than expected. Once the build pipeline was set up, iterating became easier.
What We Struggled With
Lighting and realism were harder than we expected. Our object often looked out of place because the lighting didn’t match the room. We learned that light estimation (using ARKit/ARCore features) is critical to blend virtual and real elements convincingly.
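AR Foundation surfaces the platform's light estimate through the ARCameraManager's frameReceived event. The sketch below shows one way to feed that data into a directional light so the virtual object picks up the room's brightness and tint; it assumes light estimation is enabled on the camera manager, and the component name is ours for illustration.

```csharp
using UnityEngine;
using UnityEngine.XR.ARFoundation;

// Drives a directional light from the platform's per-frame light estimate.
[RequireComponent(typeof(Light))]
public class BasicLightEstimation : MonoBehaviour
{
    [SerializeField] ARCameraManager cameraManager;   // the AR camera in the scene
    Light sceneLight;

    void Awake()
    {
        sceneLight = GetComponent<Light>();
        sceneLight.useColorTemperature = true;   // also needs the project-level setting enabled
    }

    void OnEnable() => cameraManager.frameReceived += OnFrameReceived;
    void OnDisable() => cameraManager.frameReceived -= OnFrameReceived;

    void OnFrameReceived(ARCameraFrameEventArgs args)
    {
        // Each field is nullable; the platform only fills in what it supports.
        if (args.lightEstimation.averageBrightness.HasValue)
            sceneLight.intensity = args.lightEstimation.averageBrightness.Value;

        if (args.lightEstimation.averageColorTemperature.HasValue)
            sceneLight.colorTemperature = args.lightEstimation.averageColorTemperature.Value;

        if (args.lightEstimation.colorCorrection.HasValue)
            sceneLight.color = args.lightEstimation.colorCorrection.Value;
    }
}
```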
Surface tracking wasn’t perfect. In low light or on shiny floors, plane detection would flicker or fail entirely. We had to experiment with different environments to get stable tracking.
Performance can be tricky. Even modest 3D models and shaders can slow down lower-end phones, so optimising assets was essential.
What We’d Do Differently Next Time
- Start even simpler. Our first ambition was too big. Next time, we’d focus on a single interaction or visualisation first and build up from a solid foundation.
- Use baked lighting and simpler materials. High-quality PBR textures look great in demos but can kill frame rates on real devices.
- Test early and often. Simulators don’t tell you the whole story; real-world testing is the only way to catch tracking and lighting issues.
Final Thoughts
Trying AR for the first time was eye-opening. It’s powerful, but it demands careful planning, design, and testing to feel refined. We walked away with a deeper appreciation for the teams that make seamless AR experiences look effortless.
If you’re considering your own AR project, our advice is: start small, test in real spaces, and don’t be afraid to experiment.
Do you have an AR project you want to work on? Drop us a line at hello@childcreativestudio.com and let’s collaborate!
