---
license: mit
title: Arpeggiator
sdk: static
emoji: 🦀
colorFrom: purple
colorTo: pink
short_description: Hand-controlled arpeggiator, drum machine, and visualizer
---

# Hand Gesture Arpeggiator

Hand-controlled arpeggiator, drum machine, and audio-reactive visualizer. Raise your hands to raise the roof!

An interactive web app built with Three.js, MediaPipe computer vision, Rosebud AI, and Tone.js.

- Hand #1 controls the arpeggios (raise your hand to raise the pitch, pinch to change the volume; see the mapping sketch below)
- Hand #2 controls the drums (raise different fingers to change the pattern)
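
The sketch below shows one way the landmark-to-sound mapping for Hand #1 could look. The scale, value ranges, and synth settings are assumptions for illustration (MediaPipe numbers the wrist as landmark 0, the thumb tip as 4, and the index fingertip as 8); the app's actual mapping may differ.

```js
// Illustrative sketch only, not the app's actual code.
// Assumes Tone.js is available; in a static build it is typically loaded
// via a <script> tag rather than this import.
import * as Tone from "tone";

const synth = new Tone.Synth().toDestination();
const SCALE = ["C4", "D4", "E4", "G4", "A4", "C5", "D5", "E5"]; // example scale, not the app's

function onHandResults(results) {
  const hand = results.multiHandLandmarks?.[0];
  if (!hand) return;

  // Landmark coordinates are normalized to [0, 1] with y = 0 at the top of the frame,
  // so a raised hand (small y) maps to a higher note in the scale.
  const wristY = hand[0].y;
  const step = Math.min(SCALE.length - 1, Math.floor((1 - wristY) * SCALE.length));

  // Pinch: distance between thumb tip (4) and index fingertip (8),
  // roughly 0 to 0.25 in normalized units, mapped onto a decibel range.
  const pinch = Math.hypot(hand[4].x - hand[8].x, hand[4].y - hand[8].y);
  synth.volume.value = -30 + Math.min(pinch / 0.25, 1) * 30; // -30 dB .. 0 dB

  synth.triggerAttackRelease(SCALE[step], "16n");
}
```

In the real app the notes are presumably sequenced (for example with Tone.Transport) rather than retriggered on every camera frame; the snippet only shows the gesture-to-parameter mapping.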

Video | Live Demo | More Code & Tutorials

## Requirements

- Modern web browser with WebGL support
- Camera access enabled for hand tracking (a quick capability check is sketched after this list)
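
If you want to verify both requirements up front, a minimal check could look like the following; the function name and return shape are placeholders, not part of the app.

```js
// Illustrative sketch: verify WebGL and webcam access before starting the app.
function checkRequirements() {
  // Three.js needs a WebGL (or WebGL2) context.
  const canvas = document.createElement("canvas");
  const hasWebGL = !!(canvas.getContext("webgl2") || canvas.getContext("webgl"));

  // MediaPipe hand tracking needs webcam frames; this prompts for permission.
  return navigator.mediaDevices
    .getUserMedia({ video: true })
    .then((stream) => {
      stream.getTracks().forEach((t) => t.stop()); // release the camera again
      return { hasWebGL, hasCamera: true };
    })
    .catch(() => ({ hasWebGL, hasCamera: false }));
}
```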

## Technologies

- MediaPipe for hand tracking and gesture recognition
- Three.js for audio-reactive visual rendering
- Tone.js for synthesizer sounds
- HTML5 Canvas for visual feedback
- JavaScript for real-time interaction (see the wiring sketch below)
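
As a rough sketch of how these pieces could be wired together (assuming `@mediapipe/hands`, `@mediapipe/camera_utils`, and Three.js are loaded via script tags; the CDN path, option values, and element IDs are made up for illustration):

```js
// Illustrative sketch only; not the app's actual setup code.
const video = document.getElementById("webcam");

// MediaPipe Hands: track up to two hands and pass landmarks to a callback.
const hands = new Hands({
  locateFile: (f) => `https://cdn.jsdelivr.net/npm/@mediapipe/hands/${f}`,
});
hands.setOptions({
  maxNumHands: 2,               // one hand for arpeggios, one for drums
  minDetectionConfidence: 0.7,
  minTrackingConfidence: 0.5,
});
hands.onResults(onHandResults); // e.g. the mapping sketch shown earlier

// Feed webcam frames to the tracker.
new Camera(video, {
  onFrame: async () => { await hands.send({ image: video }); },
  width: 640,
  height: 480,
}).start();

// Minimal Three.js loop for the audio-reactive visuals.
const scene = new THREE.Scene();
const camera3d = new THREE.PerspectiveCamera(60, innerWidth / innerHeight, 0.1, 100);
const renderer = new THREE.WebGLRenderer({ antialias: true });
renderer.setSize(innerWidth, innerHeight);
document.body.appendChild(renderer.domElement);

(function animate() {
  requestAnimationFrame(animate);
  renderer.render(scene, camera3d);
})();
```

Note that browsers only allow audio to start after a user gesture, so the app would also need to call `Tone.start()` from a click or tap handler before any sound plays.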

## Setup for Development

```bash
# Clone this repository
git clone https://github.com/collidingScopes/arpeggiator

# Navigate to the project directory
cd arpeggiator

# Serve with your preferred method (example using Python)
python -m http.server
```

Then open http://localhost:8000 in your browser. Browsers only grant camera access on secure origins (`https://` or `http://localhost`), so serve the files with a local server rather than opening `index.html` directly from disk.

## License

MIT License

## Credits

## Related Projects

I've released several computer vision projects (with code + tutorials) here: Fun With Computer Vision

You can purchase lifetime access and receive the full project files and tutorials. I'm adding more content regularly 🪬

You might also like some of my other open source projects:

## Contact

## Donations

If you found this tool useful, feel free to buy me a coffee.

My name is Alan, and I enjoy building open source software for computer vision, games, and more. A coffee would be much appreciated during late-night coding sessions!

Buy Me A Coffee