Testing Brain-Computer Interfaces


I recently made Wolverine Claws which are triggered by a deep learning model that uses machine vision to look at my facial expression. Now it's time to test some brain-computer interfaces from OpenBCI to see how easy it would be to control robotics and prosthetics. I've managed to pinpoint specific places in my motor cortex, but I think we'll need a slightly different electrode mount on a headset to use free thought to control anything.

Links from this video:


You can support me on Patreon or buy my Merchandise:
Patreon: https://www.patreon.com/xrobots
Merchandise: https://teespring.com/stores/james-bruton

Affiliate links – I will get some money if you use them to sign up or buy something:
Matterhackers 3D printing supplies: http://www.matterhackers.com?aff=7500
Music for your YouTube videos: http://share.epidemicsound.com/xrobots

Other socials:
Instagram: https://www.instagram.com/xrobotsuk
Facebook: https://www.facebook.com/xrobotsuk/
Twitter: https://twitter.com/xrobotsuk

CAD and Code for my projects: https://github.com/XRobots

Huge thanks to my Patrons, without whom my standard of living would drastically decline. Like, inside-out Farm Foods bag decline. Plus a very special shoutout to Lulzbot, Inc who keep me in LulzBot 3D printers and support me via Patreon.


Below you can also find a lot of the typical tools, equipment and supplies used in my projects:

Filament from: https://www.3dfuel.com/
Lulzbot 3D Printers: http://bit.ly/2Sj6nil
Lincoln Electric Welder: https://bit.ly/2Rqhqos
CNC Router: https://bit.ly/2QdsNjt
Ryobi Tools: http://bit.ly/2RhArcD
Axminster Micro Lathe: http://bit.ly/2Sj6eeN
3D Printer Filament: http://bit.ly/2PdcdUu
Soldering Iron: http://bit.ly/2DrNWDR
Vectric CNC Software: http://bit.ly/2zxpZqv

Why not join my community, which is mostly made up of actual geniuses? There's a Facebook group and everything: https://www.facebook.com/groups/287089964833488/


Former toy designer, current YouTube maker and general robotics, electrical and mechanical engineer, I’m a fan of doing it yourself and innovation by trial and error. My channel is where I share some of my useful and not-so-useful inventions, designs and maker advice. Iron Man is my go-to cosplay, and 3D printing can solve most issues – broken bolts, missing parts, world hunger, you name it.

XRobots is the community around my content where you can get in touch, share tips and advice, and more. Build FAQs, schematics and designs are also available.



Comments (30)

  1. Theoretically, if someone were to use this and maybe change a few things up (or build an original version of something like this), do you reckon it'd be possible to map the brain as it's dreaming? If anything, it'd be interesting to see how the brain changes when awake versus asleep. (Though dream mapping itself may just be what I'm curious about: a dream.)

  2. You said it yourself… These are just single-conductor electrodes that contact the skin, so I don't know why you would think they "point" in any direction and thus have directional pickup capability.
    As for moving the arms: try not waving them close to the headset – less noise. Holding them sideways out from the body, or simply making a flat hand/fist with the arms resting on your lap, works better.

  3. Unless I am missing something, you're acquiring EEG and sending it over Bluetooth to a PC; this is not BCI whatsoever. The ear clips are references (A1 and A2) and there are two of them for redundancy; your ground (FpZ) is the middle of your forehead. I suspect the clips hold an electrode rather than clipping directly onto the ear, which would be painful. You can use the snap leads and the electrodes you have to create a reference (or two) on the bone behind the ears (the mastoids). Electrode placement is based on the long-established 10-20 system. Be mindful about covering clinical concepts without prerequisite training and experience.

  4. I have been doing EEG-based BCI research for just under 6 years now and there are a lot of errors in this video, but it’s a fairly good start.

    It wouldn’t be productive to nitpick everything in here, but know that you’re unlikely to get a good motor imagery-based system with more than two classes using the OpenBCI. You need to collect more controlled data. If you want to be making inferences on imagined movement, make sure you’re not moving your limbs but rather imagining movements (rotating a doorknob works well). You should see a decrease in the mu rhythms over the sensorimotor cortex during imagined movement. You’ll typically see a decrease in both C3 and C4 mu power, but the mu power over the electrode contralateral to the imagined hand should be lower.

    If you wanted to use the activity from the actual muscle movements, then you’ll have a much easier time using EMG electrodes on the arms, since the signals are orders of magnitude more powerful and classification is trivial.
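    The mu-suppression check the commenter describes can be sketched in a few lines: estimate the power spectral density of each channel with Welch's method, average it over the mu band (roughly 8–12 Hz), and compare C3 against C4. This is a minimal illustration on synthetic signals standing in for real recordings – the sample rate, band edges, and channel names are assumptions, not anything confirmed in the video.

    ```python
    # Sketch: comparing mu-band (8-12 Hz) power at C3 vs C4.
    # Synthetic data stands in for real EEG; 250 Hz matches the
    # OpenBCI Cyton's nominal sample rate (an assumption here).
    import numpy as np
    from scipy.signal import welch

    fs = 250  # samples per second

    def mu_band_power(signal, fs=fs, band=(8.0, 12.0)):
        """Mean PSD in the mu band, estimated with Welch's method."""
        freqs, psd = welch(signal, fs=fs, nperseg=2 * fs)
        mask = (freqs >= band[0]) & (freqs <= band[1])
        return psd[mask].mean()

    # Fake 4-second epochs: a weak 10 Hz component at "C3" (suppressed mu)
    # and a strong one at "C4", both buried in unit-variance noise.
    rng = np.random.default_rng(0)
    t = np.arange(0, 4, 1 / fs)
    c3 = 0.2 * np.sin(2 * np.pi * 10 * t) + rng.normal(0, 1, t.size)
    c4 = 1.0 * np.sin(2 * np.pi * 10 * t) + rng.normal(0, 1, t.size)

    # Lower mu power over the contralateral electrode suggests
    # imagined movement of the opposite hand.
    if mu_band_power(c3) < mu_band_power(c4):
        print("mu suppression over C3 - consistent with imagined right-hand movement")
    ```

    A real pipeline would band-pass filter first, epoch around cue onsets, and baseline each trial against rest; this only shows the band-power comparison itself.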

  5. There is a cheap PCB which can handle 1S LiPo charging (the current can be set with a resistor) and has some additional features like low-voltage cutoff. Just type "TP4056 TE420" into eBay 🙂

  6. Poor pig. No moaning when the creator comes collecting his sheep or the aliens get their probes out… Remember, we beam our cruelty into space every day; I don't think anything coming here is going to want to risk dying even trying reasoning or talking.

