Demo Scripts
Hands-Free Spotify Control
This demo shows how to control Spotify hands-free using jaw clench and eye movement detections from our real-time classifiers.
Key Features:
Play, pause, and skip tracks on Spotify through simple, intuitive jaw and eye gestures.
Works seamlessly with our real-time classifier system for a hands-free music experience.
Prerequisites and Setup
Spotify Developer Setup:
Go to the Spotify Developer Dashboard and log in.
Create a new app. In the app settings, add a Redirect URI for OAuth, e.g., http://localhost:8888/callback, and select the “Web API” and “Web Playback SDK” options; the playback scopes themselves are requested by the script.
Retrieve your Client ID and Client Secret from the app settings for use in authentication.
Environment Setup: Make sure you have the spotipy library installed. You can install it using:
pip install spotipy
Steps
Add Your Spotify Credentials: Replace the placeholders in the script with your Spotify credentials as shown below:
sp = spotipy.Spotify(auth_manager=SpotifyOAuth(
    client_id="YOUR_CLIENT_ID",
    client_secret="YOUR_CLIENT_SECRET",
    redirect_uri="http://localhost:8888/callback",
    scope="user-modify-playback-state user-read-playback-state"
))
Define Music Control Functions: Use the Spotify Web API to create functions for music control actions, such as play, pause, next track, and volume control.
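For reference, here is a minimal version of these control functions using spotipy (they mirror the full script below; sp is the authenticated client created in the previous step):
def toggle_music():
    # Pause playback if something is playing, otherwise resume it
    playback = sp.current_playback()
    if playback and playback['is_playing']:
        sp.pause_playback()
    else:
        sp.start_playback()

def next_track():
    # Skip to the next track on the active device
    sp.next_track()

def previous_track():
    # Return to the previous track on the active device
    sp.previous_track()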
Create a Handler for Real-Time Predictions: Implement a handler function to connect classifier outputs (jaw clench and eye movements) to your music control functions. This function will interpret real-time predictions and map them to Spotify control actions.
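A compact version of the handler from the full script below; it checks the jaw-clench and HEOG responses in each prediction event and calls the control functions defined above (1 = left eye movement, 2 = right eye movement):
def pred_handler(event):
    msg = event.message
    # Jaw clench -> toggle playback
    if any('predictionResponse' in key for key in msg):
        if any(pred != 'Nothing' for pred in msg['predictionResponse']):
            toggle_music()
    # Horizontal eye movements (HEOG) -> switch track
    if any('heogClassificationsResponse' in key for key in msg):
        heog = msg['heogClassificationsResponse']
        if any(pred == 1 for pred in heog):
            previous_track()
        if any(pred == 2 for pred in heog):
            next_track()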
Run the Demo: Open Spotify on an active playback device, run the script to start the recording and live predictions, and control your music hands-free!
Source Code
################ IMPORTS #############################################
import asyncio
from idun_guardian_sdk import GuardianClient
import spotipy
from spotipy.oauth2 import SpotifyOAuth

#################### Configuration ####################
# Recording settings
RECORDING_TIMER = int(60 * 5)  # 5 minutes
LED_SLEEP = False

# Device ID
my_api_token = "INSERT_HERE"
device_address = "INSERT_HERE"

# Spotify configuration for music control
sp = spotipy.Spotify(auth_manager=SpotifyOAuth(
    client_id="INSERT_HERE",
    client_secret="INSERT_HERE",
    redirect_uri="http://localhost:8888/callback",
    scope="user-modify-playback-state user-read-playback-state"
))

#################### Music control ####################
def toggle_music():
    """
    Pause or start the music playback
    """
    playback = sp.current_playback()

    if playback and playback['is_playing']:
        sp.pause_playback()
    else:
        sp.start_playback()

def next_track():
    """
    Play the next track
    """
    sp.next_track()

def previous_track():
    """
    Play the previous track
    """
    sp.previous_track()

#################### Handler functions ####################
def pred_handler(event):
    """
    Handler for the live prediction data
    """
    # Jaw clench -> toggle playback
    if any('predictionResponse' in key for key in event.message):
        jaw_clenches = event.message["predictionResponse"]
        binary_jaw_clenches = [0 if jaw_clench == 'Nothing' else 1 for jaw_clench in jaw_clenches]

        if any([pred > 0 for pred in binary_jaw_clenches]):
            print('Jaw clench detected', flush=True)
            toggle_music()

    # HEOG -> switch track
    if any("heogClassificationsResponse" in key for key in event.message):
        heog = event.message["heogClassificationsResponse"]

        if any([pred == 1 for pred in heog]):
            print('Left HEOG detected', flush=True)
            previous_track()

        if any([pred == 2 for pred in heog]):
            print('Right HEOG detected', flush=True)
            next_track()


#################### Main ####################
if __name__ == "__main__":

    # Create client
    client = GuardianClient(api_token=my_api_token, address=device_address)

    # Subscribe to live predictions
    client.subscribe_realtime_predictions(jaw_clench=True, fft=False, bin_heog=True, handler=pred_handler)

    # Start the recording
    asyncio.run(
        client.start_recording(
            recording_timer=RECORDING_TIMER, led_sleep=LED_SLEEP, calc_latency=False
        )
    )

    # Prints and downloads
    rec_id = client.get_recording_id()
    print("RecordingId", rec_id)
Control Philips Hue Lights with our Eye Movement Classifier
This demo shows how to turn Philips Hue lights on and off by looking left and right.
Key Features:
Turn on and off Philips Hue lights by looking left and right.
Works seamlessly with our real-time classifier system to control your Philips Hue lights with eye movements.
Prerequisites and Setup
Install the phue Python library for Philips Hue:
pip install phue
Install the IDUN Guardian SDK:
pip install idun_guardian_sdk
Philips Hue Bridge Setup:
Connect your Philips Hue Bridge to your network and find its IP address.
Press the button on the bridge to pair it with your network (see the pairing sketch below).
Retrieve the bridge IP address.
Configure the IP address and username of the bridge in the script.
BRIDGE_IP = 'YOUR_BRIDGE_IP'
BRIDGE_USERNAME = 'YOUR_BRIDGE_USERNAME'
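If you do not yet have a bridge username, press the link button on the bridge and run a one-time pairing to register one (this mirrors the comment block in the full script below; save the printed username for later runs):
from phue import Bridge

bridge = Bridge('YOUR_BRIDGE_IP')
try:
    bridge.connect()
    print(f"Connected! Save this username for future use: {bridge.username}")
except Exception as e:
    print(f"Error connecting to the Bridge: {e}")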
Connect to the Philips Hue Bridge:
bridge = Bridge(BRIDGE_IP, username=BRIDGE_USERNAME)
bridge.connect()
Define Light Control Functions:
def handle_left_eye_movement():
    lamp = "Ceiling Lamp 1"
    bridge.set_light(lamp, 'on', True)
    time.sleep(2)
    bridge.set_light(lamp, 'on', False)

def handle_right_eye_movement():
    lamp = "Ceiling Lamp 2"
    bridge.set_light(lamp, 'on', True)
    time.sleep(2)
    bridge.set_light(lamp, 'on', False)
Create a Handler for Real-Time Predictions:
def handle_eye_movement(data):
    prediction = data.message
    for p in prediction['heogClassificationsResponse']:
        if p == 1:
            handle_left_eye_movement()
        if p == 2:
            handle_right_eye_movement()
Connect to the IDUN Guardian Device and start data streaming:
client = GuardianClient()
client.subscribe_realtime_predictions(bin_heog=True, handler=handle_eye_movement)
asyncio.run(client.start_recording(recording_timer=RECORDING_TIMER))
Source Code
import asyncio
import time

from phue import Bridge

from idun_guardian_sdk import GuardianClient

"""
# Install phue
pip install phue

# Install the idun guardian sdk
pip install idun-guardian-sdk

# Replace 'BRIDGE_IP_ADDRESS' with the IP of your Hue Bridge
bridge = Bridge('BRIDGE_IP_ADDRESS')

# First-time connection: Press the button on the Bridge and run this code
try:
    bridge.connect()
    print(f"Connected! Save this username for future use: {bridge.username}")
except Exception as e:
    print(f"Error connecting to the Bridge: {e}")

# Subsequent connections
# Replace with the IP address of your Bridge and your previously saved username
bridge = Bridge('BRIDGE_IP_ADDRESS', username='YOUR_SAVED_USERNAME')

# Now you can control the lights without pressing the button again
bridge.set_light(1, 'on', True)
"""

BRIDGE_IP = ''
BRIDGE_USERNAME = ''

bridge = Bridge(BRIDGE_IP, username=BRIDGE_USERNAME)
bridge.connect()

RECORDING_TIMER: int = 60 * 15  # set a recording length in seconds


def handle_left_eye_movement():
    lamp = "Ceiling Lamp 1"
    bridge.set_light(lamp, 'on', True)
    time.sleep(2)
    bridge.set_light(lamp, 'on', False)


def handle_right_eye_movement():
    lamp = "Ceiling Lamp 2"
    bridge.set_light(lamp, 'on', True)
    time.sleep(2)
    bridge.set_light(lamp, 'on', False)


def handle_eye_movement(data):
    prediction = data.message
    for p in prediction['heogClassificationsResponse']:
        # debug: 1 = left, 2 = right, 0 = none
        # print(p)

        if p == 1:
            handle_left_eye_movement()

        if p == 2:
            handle_right_eye_movement()


if __name__ == '__main__':
    client = GuardianClient()
    client.subscribe_realtime_predictions(bin_heog=True, handler=handle_eye_movement)
    asyncio.run(client.start_recording(recording_timer=RECORDING_TIMER))
    rec_id = client.get_recording_id()