BYS
AI fashion shopping app with personal styling, search, and virtual try-on
Project Details
Fashion shoppers are still asked to make deeply personal style decisions from generic catalog grids, static model photos, and blunt filters. They have to guess whether a colour works on them, whether an outfit suits their taste, and where to save candidate looks while bouncing between retailers. That makes discovery feel manual and high-friction instead of curated.
BYS turns shopping into a guided mobile styling flow. Users sign in with Google or Apple, share a few preference signals, take a selfie for AI-assisted colour analysis, and browse a swipeable recommendation feed with natural-language search, wishlist actions, and retailer deep links. When they want more confidence, they can queue an asynchronous virtual try-on and return to a dedicated Trial Room when the result is ready.
What It Does
Passwordless Sign-In
The app starts with Google and Apple authentication, then stores and refreshes session tokens securely so returning users can get back into shopping without managing passwords.
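The refresh step above can be sketched in TypeScript. This is a hypothetical illustration, not BYS's actual implementation: the `StoredSession` shape, the one-minute refresh margin, and the idea of refreshing slightly before expiry are all assumptions; in the app the tokens themselves would live in SecureStore.

```typescript
// Hypothetical session shape for a Google/Apple sign-in session.
// Field names and the refresh policy are assumptions for illustration.
interface StoredSession {
  accessToken: string;
  refreshToken: string;
  expiresAt: number; // Unix epoch, milliseconds
}

// Refresh slightly early so an API call never races token expiry.
const REFRESH_MARGIN_MS = 60_000;

function shouldRefresh(session: StoredSession, now: number = Date.now()): boolean {
  return now >= session.expiresAt - REFRESH_MARGIN_MS;
}
```

A guard like this lets every authenticated API call ask one cheap question before it fires, instead of discovering an expired token via a 401 mid-flow.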
Personal Style Intake
Onboarding captures gender, birth year, preferred brands, and push notification consent so the first feed is shaped by actual taste signals instead of a cold start.
AI Colour Analysis
A guided selfie flow uploads a compressed front-camera image for analysis, then reflects the detected palette back through an editable avatar for hair, eye, and skin tone review.
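The compression step can be illustrated with a small sizing helper. This is a sketch under assumptions: the 1080px edge budget is invented, and the actual resize/encode would be delegated to an image library such as expo-image-manipulator; the helper only computes the aspect-ratio-preserving target dimensions.

```typescript
// Assumed upload budget: longest edge capped at 1080px before upload.
const MAX_EDGE_PX = 1080;

// Compute a downscale target that preserves aspect ratio.
// Images already inside the budget are returned unchanged.
function resizeTarget(
  width: number,
  height: number,
  maxEdge: number = MAX_EDGE_PX
): { width: number; height: number } {
  const longest = Math.max(width, height);
  if (longest <= maxEdge) return { width, height };
  const scale = maxEdge / longest;
  return {
    width: Math.round(width * scale),
    height: Math.round(height * scale),
  };
}
```

Doing this computation before the upload keeps selfie payloads predictable on unstable mobile networks, which is what makes the analysis step feel reliable.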
Semantic Discovery
Discovery supports natural-language product search, trending prompts, and saved filters across category, size, colour, and price so browsing feels closer to describing an outfit than filling out a form.
Swipe Recommendation Feed
The core feed turns likes, dislikes, and cart actions into new product batches, while local persistence protects the session if the app closes mid-request or mid-browse.
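The persistence described above can be sketched as a pair of pure functions over a serializable session. The shape and the recovery rule are assumptions for illustration, not the app's actual state model; in practice the result of each update would be written to AsyncStorage so a killed app can resume mid-batch.

```typescript
// Hypothetical feed-session shape persisted between app launches.
type Swipe = { productId: string; action: "like" | "dislike" | "cart" };

interface FeedSession {
  batchIds: string[]; // product ids in the current recommendation batch
  cursor: number;     // index of the next card to show
  swipes: Swipe[];    // actions not yet acknowledged by the server
}

// Pure update: the caller serializes the result to local storage.
function recordSwipe(session: FeedSession, swipe: Swipe): FeedSession {
  return {
    ...session,
    cursor: session.cursor + 1,
    swipes: [...session.swipes, swipe],
  };
}

// Recovery after an interrupted session: clamp the cursor in case
// the persisted state outlived a truncated or replaced batch.
function resume(saved: FeedSession): FeedSession {
  return { ...saved, cursor: Math.min(saved.cursor, saved.batchIds.length) };
}
```

Keeping the updates pure makes the resilience testable: every interruption scenario reduces to "serialize, reload, resume" over plain data.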
Trial Room Workflow
Users can upload up to six reusable photos, launch virtual try-ons from product detail, and review generated looks inside a dedicated Trial Room that opens directly from a push notification.
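Opening the Trial Room from a push notification comes down to mapping a notification payload to a route. The payload shape, `type` value, and path string below are assumptions for illustration; in the app the returned path would be handed to Expo Router once the notification response arrives.

```typescript
// Hypothetical payload attached to a "try-on ready" push notification.
interface TryOnPayload {
  type: "tryon_ready";
  tryOnId: string;
}

// Map untrusted notification data to an in-app route, or null to
// fall back to the default screen for unknown payloads.
function routeForNotification(data: unknown): string | null {
  const payload = data as Partial<TryOnPayload> | null;
  if (payload?.type === "tryon_ready" && typeof payload.tryOnId === "string") {
    return `/trial-room/${payload.tryOnId}`;
  }
  return null;
}
```

Treating the payload as untrusted input matters here: a malformed or stale notification should land the user somewhere sensible, not crash the deep link.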
Highlights
What We Built
BYS is a consumer mobile app for shoppers who want clothing discovery to feel personal instead of algorithmically generic. The product combines guided onboarding, natural-language discovery, a swipe-based recommendation feed, and virtual try-on into a single mobile flow that moves from taste capture to purchase intent. On the client side, the app is built with Expo, React Native, TypeScript, Expo Router, and React Navigation, with Google and Apple sign-in layered directly into the onboarding path. A deployed Azure Container Apps backend handles the authenticated APIs for recommendations, search, account data, try-on requests, and push-token registration. The overall architecture is simple on paper, but it is built around a lot of stateful shopping behavior that has to feel reliable on a phone.
The Hardest Problems
The toughest challenge was not drawing screens; it was making personalization feel trustworthy without making onboarding feel like work. The app asks for just enough signal to personalize the feed, then has to compress, upload, and analyze a selfie reliably enough to support colour analysis and future try-on flows on unstable mobile networks. Virtual try-on adds another layer of difficulty because it is asynchronous, so the product has to reassure the user immediately, persist their context, and then route them back into the right place when a notification arrives. Authentication was also non-trivial because Google and Apple have different token lifecycles, and both flows needed secure storage plus refresh handling before any API call could be trusted. Even the feed itself needed defensive engineering, with cached state and recovery logic for interrupted recommendation requests and half-finished browsing sessions.
What We Learned
The strongest lesson in this codebase is that state resilience matters as much as interface polish in a shopping product. A fashion app can look premium, but if search context disappears, a swipe session resets, or uploaded photos get reordered unexpectedly, the experience stops feeling personal immediately. BYS leans heavily on AsyncStorage and SecureStore because continuity is part of the product promise, not just an implementation detail. Another clear takeaway is that AI features only feel valuable when they are tightly wrapped in concrete UX, like a better feed, an editable avatar, or a push-backed Trial Room. If this were being taken into the next phase, the biggest improvement would be to formalize the app state and service layer more aggressively so all of that resilience becomes easier to reason about and extend.
The Result
The result is a polished mobile shopping experience that does more than show products: it captures taste, narrows discovery, supports confidence-building try-ons, and sends users back to the right place when results are ready. The repository shows a mature release cadence, internal production build configuration, and a feature set that already spans authentication, personalization, search, recommendations, notifications, and retailer handoff. There is no public store URL checked into the codebase, but the app itself is clearly beyond prototype stage and structured like a product intended for repeat use. The guiding principle is visible throughout the project: personalization should remove decisions, not create more of them.
Like what you see?
Let's build your next product together.