Computer vision for recipe suggestions running on iOS 🤷‍♂️
🤳👁️📲
An augmented reality app that uses iOS Core ML for on-device inference: a pre-trained TensorFlow model is converted into a lite version that runs efficiently on an iPhone in real time — no server round-trip is needed for food recognition predictions 🤯
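A minimal sketch of what the on-device recognition step can look like with Vision + Core ML. `FoodClassifier` is a hypothetical name for the bundled converted model, and the confidence threshold is an assumption, not a value from this repo:

```swift
import Vision
import CoreML

// Runs the bundled Core ML food classifier against camera frames and
// reports the labels it is reasonably confident about.
final class FoodRecognizer {
    private let request: VNCoreMLRequest

    init(completion: @escaping ([String]) -> Void) throws {
        // "FoodClassifier" is a placeholder for the Xcode-generated model class.
        let model = try VNCoreMLModel(for: FoodClassifier().model)
        request = VNCoreMLRequest(model: model) { request, _ in
            // Keep only confident labels, e.g. "eggs", "avocado" (0.6 is an assumed cutoff).
            let labels = (request.results as? [VNClassificationObservation])?
                .filter { $0.confidence > 0.6 }
                .map { $0.identifier } ?? []
            completion(labels)
        }
        request.imageCropAndScaleOption = .centerCrop
    }

    // Run inference on a single captured camera frame.
    func classify(pixelBuffer: CVPixelBuffer) {
        let handler = VNImageRequestHandler(cvPixelBuffer: pixelBuffer, options: [:])
        try? handler.perform([request])
    }
}
```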
Once the model has made its predictions, the app sends an array like [eggs, avocado, cheese] to the backend, which runs a table query for recipes matching those keywords (just like any ordinary search / lookup). The returned recipes are then displayed as cards that pop up at the bottom of the screen for the user to open and read — see the sketch below.
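The client side of that lookup could look something like this. The endpoint URL, the `Recipe` fields, and the function names are all illustrative assumptions, not the app's actual API:

```swift
import Foundation

// Hypothetical shape of a recipe returned by the backend search endpoint.
struct Recipe: Decodable {
    let title: String
    let url: String
}

// POSTs the detected ingredient keywords and decodes the matching recipes.
func fetchRecipes(for ingredients: [String],
                  completion: @escaping ([Recipe]) -> Void) {
    // "example.com/recipes/search" is a placeholder endpoint.
    var request = URLRequest(url: URL(string: "https://example.com/recipes/search")!)
    request.httpMethod = "POST"
    request.setValue("application/json", forHTTPHeaderField: "Content-Type")
    request.httpBody = try? JSONEncoder().encode(ingredients) // e.g. ["eggs", "avocado", "cheese"]

    URLSession.shared.dataTask(with: request) { data, _, _ in
        let recipes = data.flatMap { try? JSONDecoder().decode([Recipe].self, from: $0) } ?? []
        // Hop back to the main thread before updating the recipe cards UI.
        DispatchQueue.main.async { completion(recipes) }
    }.resume()
}
```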
Apple's ARKit is used to pin an XYZ depth coordinate to each region where a food item has been detected, and after some adjustments it tracks those points pretty well.
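One way to do that pinning is to raycast from the centre of a detection rectangle into the AR scene and drop an anchor at the hit. This is a sketch under assumed names (`sceneView`, `boundingBox`, the `"food-item"` anchor name), not the app's exact code:

```swift
import ARKit

// Pins a world-space anchor under a detected food item, given its
// detection rectangle in view coordinates.
func pinAnchor(for boundingBox: CGRect, in sceneView: ARSCNView) {
    let center = CGPoint(x: boundingBox.midX, y: boundingBox.midY)

    // Cast a ray from the 2D detection into the scene, looking for an
    // estimated real-world surface under the food item.
    guard let query = sceneView.raycastQuery(from: center,
                                             allowing: .estimatedPlane,
                                             alignment: .any),
          let result = sceneView.session.raycast(query).first else { return }

    // result.worldTransform carries the XYZ coordinate where the item sits.
    let anchor = ARAnchor(name: "food-item", transform: result.worldTransform)
    sceneView.session.add(anchor: anchor)
}
```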