import streamlit as st

# Set page configuration
st.set_page_config(
    page_title="Asimo Foundation | CV Journey",
    page_icon="🤖",
    layout="wide",
    initial_sidebar_state="expanded",
)

# Title and introduction
st.header("🤖 Asimo Foundation - STEM Education")

st.markdown(
    """
### Bringing Technology Education to Public Schools

The Asimo Foundation is a social project at UNIFEI that aims to reduce educational inequality 
by bringing STEAM (Science, Technology, Engineering, Arts, and Mathematics) education to public schools in the region.

This initiative:
- Introduces students to robotics, programming, and technology
- Provides hands-on experience with Arduino, Lego Mindstorms, and ESP32
- Develops problem-solving and critical thinking skills
- Inspires interest in technology and engineering careers
"""
)

# Project details in tabs
project_tabs = st.tabs(["Mission & Impact", "Technologies", "Teaching Methodology"])

with project_tabs[0]:
    col1, col2 = st.columns([3, 2])

    with col1:
        st.markdown(
            """
        ### Our Mission
        
        The Asimo Foundation believes that all students, regardless of socioeconomic background, 
        deserve access to high-quality STEM education. By bringing technology education to public
        schools, we aim to:
        
        - **Bridge the digital divide** between private and public education
        - **Empower students** with technical skills for the future job market
        - **Inspire curiosity and innovation** in young minds
        - **Provide university students** with teaching experience and community engagement
        """
        )


with project_tabs[1]:
    col1, col2, col3 = st.columns(3)

    with col1:
        st.markdown(
            """
        ### Arduino
        
        **Applications:**
        - Basic circuits and electronics
        - Sensor integration (temperature, light, distance)
        - Simple robotics projects (line followers, obstacle avoidance)
        - LED control and displays
        
        **Benefits:**
        - Low cost and widely available
        - Excellent introduction to programming and electronics
        - Versatile platform with thousands of project examples
        """
        )

    with col2:
        st.markdown(
            """
        ### Lego Mindstorms
        
        **Applications:**
        - Robot construction and design
        - Visual programming introduction
        - Sensor integration and robotics concepts
        - Competitive challenges and problem-solving
        
        **Benefits:**
        - Intuitive building system
        - Robust components for classroom use
        - Engaging form factor that appeals to students
        - Scaffolded learning progression
        """
        )

    with col3:
        st.markdown(
            """
        ### ESP32
        
        **Applications:**
        - IoT (Internet of Things) projects
        - Wireless communication
        - Advanced sensing and control
        - Web-based interfaces
        
        **Benefits:**
        - Built-in Wi-Fi and Bluetooth
        - Powerful processing capabilities
        - Low power consumption
        - Bridge to more advanced applications
        """
        )

with project_tabs[2]:
    st.markdown(
        """
    ### Our Teaching Approach
    
    We follow a project-based learning methodology that emphasizes:
    
    1. **Hands-on Exploration:** Students learn by doing, building, and experimenting
    2. **Collaborative Problem-Solving:** Group projects that encourage teamwork
    3. **Incremental Challenges:** Starting with simple concepts and building to complex projects
    4. **Real-World Applications:** Connecting technology concepts to everyday life
    5. **Student-Led Innovation:** Encouraging creativity and independent thinking
    
    This approach ensures that students not only learn technical skills but also develop critical thinking,
    collaboration, and self-confidence.
    """
    )

st.markdown("---")

# Gesture-controlled robotic arm project
st.subheader("Featured Project: Gesture-Controlled Robotic Arm")

col1, col2 = st.columns(2)

with col1:
    st.markdown(
        """
    ### Computer Vision Meets Robotics
    
    This project combines computer vision with robotic control to create an intuitive
    interface for controlling a robotic arm using hand gestures.
    
    **How it works:**
    1. A webcam captures the user's hand movements
    2. MediaPipe hand tracking detects hand landmarks in real-time
    3. Custom algorithms convert hand position to servo angles
    4. Arduino/ESP32 receives commands and controls the servo motors
    5. The robotic arm mimics the user's hand movements
    
    This project demonstrates how computer vision can create natural human-machine interfaces
    and serves as an engaging introduction to both robotics and CV concepts.
    """
    )

with col2:
    # Placeholder for robotic arm image
    st.image(
        "assets/robotic_arm.jpg",
        caption="Robotic Arm used in the Asimo Foundation project",
        use_container_width=True,
    )

# Technical implementation details
st.subheader("Technical Implementation")

implementation_tabs = st.tabs(["Hand Tracking", "Angle Calculation", "Arduino Control"])

with implementation_tabs[0]:
    st.markdown(
        """
    ### MediaPipe Hand Tracking
    
    We use Google's MediaPipe framework to detect and track hand landmarks in real-time.
    
    **Key Technologies:**
    - [MediaPipe](https://developers.google.com/mediapipe) - Google's open-source framework for building multimodal ML pipelines
    - [MediaPipe Hands](https://developers.google.com/mediapipe/solutions/vision/hand_landmarker) - Specific solution for hand tracking
    - [OpenCV](https://opencv.org/) - Open source computer vision library
    
    **What it does:**
    - Detects 21 landmarks on each detected hand
    - Works in real-time on CPU
    - Provides robust tracking even with partial occlusion
    - Returns normalized 3D coordinates for each landmark
    
    **Resources:**
    - [MediaPipe GitHub](https://github.com/google/mediapipe)
    - [Hand Tracking Tutorial](https://developers.google.com/mediapipe/solutions/vision/hand_landmarker/python)
    - [OpenCV Documentation](https://docs.opencv.org/)
    """
    )
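
    # Illustrative sketch only (not the project's production code), assuming the
    # legacy mp.solutions.hands API: a minimal webcam loop that reads landmarks.
    st.markdown("**Minimal detection loop (illustrative sketch):**")
    st.code(
        """import cv2
import mediapipe as mp

mp_hands = mp.solutions.hands
cap = cv2.VideoCapture(0)

with mp_hands.Hands(max_num_hands=1, min_detection_confidence=0.7) as hands:
    while cap.isOpened():
        ok, frame = cap.read()
        if not ok:
            break
        # MediaPipe expects RGB input; OpenCV captures frames in BGR
        results = hands.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
        if results.multi_hand_landmarks:
            # 21 normalized (x, y, z) landmarks per detected hand
            wrist = results.multi_hand_landmarks[0].landmark[mp_hands.HandLandmark.WRIST]
            print(f"wrist at ({wrist.x:.2f}, {wrist.y:.2f})")

cap.release()""",
        language="python",
    )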

with implementation_tabs[1]:
    st.markdown(
        """
    ### Mapping Hand Position to Servo Angles
    
    Converting hand landmark positions to meaningful servo angles requires mathematical transformations.
    
    **Key Technologies:**
    - [NumPy](https://numpy.org/) - Fundamental package for scientific computing in Python
    - [SciPy](https://scipy.org/) - Library for mathematics, science, and engineering
    
    **What it does:**
    - Calculates angles between landmarks
    - Maps raw angles to appropriate servo ranges
    - Applies smoothing and filtering to reduce jitter
    - Converts 3D hand positions to robotic arm coordinate space
    
    **Resources:**
    - [NumPy Documentation](https://numpy.org/doc/stable/)
    - [SciPy Spatial Transforms](https://docs.scipy.org/doc/scipy/reference/spatial.html)
    - [Vector Mathematics Tutorial](https://realpython.com/python-linear-algebra/)
    """
    )
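
    # Illustrative sketch only, with hypothetical helper names: an angle between
    # landmark segments, a linear map into the servo range, and jitter smoothing.
    st.markdown("**Angle-mapping helpers (illustrative sketch):**")
    st.code(
        """import numpy as np

# Angle (degrees) at landmark b, formed by the segments b->a and b->c
def joint_angle(a, b, c):
    ba = np.asarray(a) - np.asarray(b)
    bc = np.asarray(c) - np.asarray(b)
    cos = np.dot(ba, bc) / (np.linalg.norm(ba) * np.linalg.norm(bc))
    return np.degrees(np.arccos(np.clip(cos, -1.0, 1.0)))

# Linearly map a measured value into the servo's allowed angle range
def to_servo(value, low, high, servo_min=0, servo_max=180):
    t = np.clip((value - low) / (high - low), 0.0, 1.0)
    return servo_min + t * (servo_max - servo_min)

# Exponential smoothing reduces frame-to-frame jitter before commanding a servo
def smooth(previous, current, alpha=0.3):
    return current if previous is None else alpha * current + (1 - alpha) * previous""",
        language="python",
    )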

with implementation_tabs[2]:
    st.markdown(
        """
    ### Arduino Communication and Control
    
    The calculated angles are sent to an Arduino to control the servos.
    
    **Key Technologies:**
    - [pyFirmata2](https://github.com/berndporr/pyFirmata2) - Python interface for the Firmata protocol
    - [Firmata](https://github.com/firmata/arduino) - Protocol for communicating with microcontrollers
    - [PySerial](https://pyserial.readthedocs.io/en/latest/) - Python serial port access library
    - [Arduino Servo Library](https://www.arduino.cc/reference/en/libraries/servo/) - Controls servo motors
    
    **What it does:**
    - Establishes serial communication between Python and Arduino
    - Formats and sends servo angle commands
    - Controls multiple servo motors in the robotic arm
    - Provides real-time response to hand position changes
    """
    )
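
    # Illustrative sketch only, assuming pyFirmata2 with StandardFirmata flashed
    # on the board; the port and pin number are placeholders, not project values.
    st.markdown("**Servo control over Firmata (illustrative sketch):**")
    st.code(
        """from pyfirmata2 import Arduino

# The Arduino must be running the StandardFirmata sketch
board = Arduino(Arduino.AUTODETECT)  # or an explicit port, e.g. "/dev/ttyUSB0"

# "d:9:s" = digital pin 9 configured as a servo output
base_servo = board.get_pin("d:9:s")

# Clamp to the servo's 0-180 degree range and send the angle command
def move_base(angle_deg):
    base_servo.write(max(0, min(180, int(angle_deg))))

move_base(90)  # center the base joint
board.exit()""",
        language="python",
    )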

# Demo video
st.markdown("### Demo Video")
st.video("assets/hand_control_arm_video.mp4")
st.caption("Demonstration of hand gesture-controlled robotic arm")
st.markdown(
    """
* [GitHub Repository](https://github.com/Fundacao-Asimo/RoboArm)
"""
)

# Educational impact
st.markdown("---")
st.subheader("Educational Impact")

st.markdown(
    """
### Learning Outcomes
- **Computer Vision Concepts:** Introduction to image processing, feature detection, and tracking
- **Robotics Fundamentals:** Servo control, degrees of freedom, coordinate systems
- **Programming Skills:** Python, Arduino/C++, communication protocols
- **Engineering Design:** System integration, calibration, testing

### Student Feedback
Students find this project particularly engaging because it:
- Provides immediate visual feedback
- Feels like "magic" when the arm responds to hand movements
- Combines multiple disciplines in a tangible application
- Offers many opportunities for creative extensions and customization
"""
)

# Footer with attribution
st.markdown("---")
st.markdown(
    """
### Project Team

This work was developed and implemented as part of the Asimo Foundation at UNIFEI.
Special thanks to all the volunteers, educators, and students who contributed to this initiative.
"""
)


st.markdown("[🌎 Asimo Foundation](https://www.instagram.com/fundacaoasimo/)")