---
title: React Native Serve Ml
emoji: 🔥
colorFrom: red
colorTo: blue
sdk: docker
pinned: false
license: mit
---

# React Native App with Expo

This repository contains a static React Native application built with Expo, using FastAPI and Docker for deployment. It runs as a single container, hosted here, and serves several diffusion models through the Hugging Face Inference API. The code uses baseline components to demonstrate deployment techniques for ML endpoints. The root repository and alternative deployments are here. A blog post explaining this deployment and the Hugging Face Inference API can be found here.
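As a rough sketch of the serving pattern described above, the FastAPI side can forward a prompt to the Hugging Face Inference API and return the generated image. The module name, `MODEL_ID`, request shape, and route path below are illustrative assumptions, not the repository's actual code:

```python
# server_sketch.py - illustrative only; not this repository's actual code
import os

import httpx
from fastapi import FastAPI
from fastapi.responses import Response
from pydantic import BaseModel

app = FastAPI()

# Hypothetical model choice; the Space serves several diffusion models.
MODEL_ID = "stabilityai/stable-diffusion-2-1"
INFERENCE_URL = f"https://api-inference.huggingface.co/models/{MODEL_ID}"


class Prompt(BaseModel):
    inputs: str


@app.post("/api")
async def generate(prompt: Prompt) -> Response:
    # HF_TOKEN is the secret added in the Space Settings (see Installation).
    headers = {"Authorization": f"Bearer {os.environ['HF_TOKEN']}"}
    async with httpx.AsyncClient(timeout=120.0) as client:
        resp = await client.post(
            INFERENCE_URL, headers=headers, json={"inputs": prompt.inputs}
        )
    resp.raise_for_status()
    # Text-to-image models on the Inference API return raw image bytes.
    return Response(content=resp.content, media_type="image/png")
```

A request body like `{"inputs": "a watercolor fox"}` posted to `/api` would then come back as PNG bytes ready to display in the app.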

## Installation

To generate the static content for this container, you need a working Node/npm installation; then clone this GitHub repo. In your Hugging Face Space Settings, add your HF_TOKEN as a secret.

```bash
npm install -g yarn
yarn
npx expo export:web
```

Static files are output to the web-build folder in the root directory. Replace the web-build folder in the Huggingface-Space directory with the freshly exported one; the static content of the app is then updated. The Huggingface-Space directory can be deployed as a single container. Inside a single container, point the axios call at "http://localhost:/api", not "http://localhost:8081/api".
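For context on why the relative endpoint works, here is a minimal sketch (directory name taken from the step above, everything else assumed) of serving the exported web build and the API from the same FastAPI app, so both share one origin inside the container:

```python
# sketch: one FastAPI app serving both the API and the Expo web export
from fastapi import FastAPI
from fastapi.staticfiles import StaticFiles

app = FastAPI()

# /api routes would be registered here, before the static mount,
# so they take precedence over the catch-all static handler.

# Serve the exported Expo build at the root URL. With the app and the
# API on the same origin, the client reaches "http://localhost:/api"
# without naming the dev-server port 8081.
app.mount("/", StaticFiles(directory="web-build", html=True), name="static")
```

Routes registered before the static mount are matched first, which is why `/api` stays reachable while every other path falls through to the static Expo export.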

## License

This project is licensed under the MIT License.

## Acknowledgments

- This application is built with Expo, a powerful framework for building cross-platform mobile applications. Learn more about Expo: https://expo.io