Author Topic: infosec risk in custom keyboard deployment  (Read 1778 times)


Offline admiralvorian

  • Thread Starter
  • Posts: 321
  • Location: United States
  • DIY
infosec risk in custom keyboard deployment
« on: Fri, 12 May 2017, 12:19:04 »
One day my coworker walked away from his desk, and I realized it would be very easy to flash his ErgoDox with malicious firmware. I haven't gone so far as to write a proof of concept, but the potential is obvious... obvious enough that it has resulted in a complete ban on keyboards with custom controllers (among other peripherals) and the creation of an 'approved peripherals' list at my workplace.

Well, that totally sucks. Anyone have ideas on how to make a keyboard controller that can't be re-flashed once written? Can a Teensy be physically modified to accomplish this (does it have a fuse somewhere? can I cut some traces?), or would I have to make my own PCB with a ROM on it?

Darude Status:
☐ Not Sandstorm
☑ Sandstorm                                               wts wtt wtb

Offline Phenix

  • Posts: 533
  • Location: Germany
Re: infosec risk in custom keyboard deployment
« Reply #1 on: Fri, 12 May 2017, 12:47:44 »
desolder the pushbutton of the teensy..

Sure, if someone is serious about it they will just desolder the whole board and replace it with a tampered Teensy. But that takes time.
Winter is coming.

Offline davkol

  • Posts: 4629
  • Location: CZ
Re: infosec risk in custom keyboard deployment
« Reply #2 on: Fri, 12 May 2017, 12:56:08 »
In that case, any device with upgradeable firmware would have to be banned.

Anyway, what would be the malicious function of a keyboard controller? A keylogger? I doubt it, if the controller is an ordinary Teensy or Pro Micro, because the on-board memory is tiny.

Furthermore, this is such an obscure attack vector that it's very unlikely to be used in practice. There are bigger fish to fry.

Offline Findecanor

  • Posts: 3460
  • Location: Stockholm, Sweden
Re: infosec risk in custom keyboard deployment
« Reply #3 on: Fri, 12 May 2017, 12:58:17 »
Firmware cannot be loaded onto a Teensy 2.0 purely through malicious code running on the host unless the installed firmware supports it from the start, but I have heard of no firmware that does, and I see no reason why any should.

A few firmwares allow a key combination to enter the bootloader, which allows replacing the firmware over USB.
But if that is not available, then you would need to open up the keyboard and physically press the reset button, or short the reset pin to GND, to start the bootloader.
The only way to disable the bootloader would be to replace it with your own. I am not sure that PJRC's bootloader is able to overwrite itself, but there is also a raw programming interface that requires direct access to the ATmega32u4 chip's pins.

In other words, you would need some level of physical access to the keyboard or the keyboard controller to be able to replace the firmware.
But at least these mechanisms are all documented and well known, and you have access to your firmware's source code.
Commercial peripherals may also have a physical programming interface or some proprietary bootloader that you know nothing about.
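To make that software path concrete, here is a minimal, self-contained sketch of how a firmware "enter bootloader" key typically works, and how building without it closes that path. All names here (process_keycode, bootloader_jump, QK_BOOT, ALLOW_BOOTLOADER_KEY) are illustrative stand-ins, not actual TMK/QMK internals:

```c
#include <assert.h>
#include <stdbool.h>

/* Illustrative stand-ins for firmware internals; real TMK/QMK code differs. */
static bool bootloader_entered = false;
static void bootloader_jump(void) { bootloader_entered = true; }

#define QK_BOOT 0x5C00u  /* hypothetical "enter bootloader" keycode */

/* Build with -DALLOW_BOOTLOADER_KEY to keep the escape hatch; building
 * without it removes the only software route into the bootloader, leaving
 * the physical reset button/pin as the sole entry point. */
static void process_keycode(unsigned keycode) {
    bool allow =
#ifdef ALLOW_BOOTLOADER_KEY
        true;
#else
        false;
#endif
    if (allow && keycode == QK_BOOT) {
        bootloader_jump();
        return;
    }
    (void)keycode;  /* ...normal matrix/keymap handling would go here... */
}
```

Real firmwares differ in the details, but the shape is the same: if no compiled-in code path calls the bootloader jump, the only way in is the physical reset button or pin.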
"It is no measure of health to be well adjusted to a profoundly sick society."
Daily driver: Phantom (Lubed Cherry MX Clear, Lasered Cherry PBT keycaps with Row A. Plastic "Frankencase". Custom firmware, Swedish layout)

Offline Findecanor

  • Posts: 3460
  • Location: Stockholm, Sweden
Re: infosec risk in custom keyboard deployment
« Reply #4 on: Fri, 12 May 2017, 13:07:25 »
Quote
desolder the pushbutton of the teensy..
All the push button does is short the RESET line to GND. The Teensy 2.0 board also has a pin to the RESET line, so that you could solder on an external reset switch.
... and even if that was not available, you could reach in there and short the pins on the chip package directly.

But yes, the best solution might be to desolder that switch ... and then encase the entire controller board on the ErgoDox PCB in epoxy (putty).
"It is no measure of health to be well adjusted to a profoundly sick society."
Daily driver: Phantom (Lubed Cherry MX Clear, Lasered Cherry PBT keycaps with Row A. Plastic "Frankencase". Custom firmware, Swedish layout)

Offline Phenix

  • Posts: 533
  • Location: Germany
Re: infosec risk in custom keyboard deployment
« Reply #5 on: Fri, 12 May 2017, 13:49:14 »
Quote
desolder the pushbutton of the teensy..
All the push button does is short the RESET line to GND. The Teensy 2.0 board also has a pin to the RESET line, so that you could solder on an external reset switch.
... and even if that was not available, you could reach in there and short the pins on the chip package directly.

But yes, the best solution might be to desolder that switch ... and then encase the entire controller board on the ErgoDox PCB in epoxy (putty).

Yeah, but it would make it harder... a bit.
Isn't it the same with safes? You pay for the extra time your goods stay safe, compared to just a shelf.

Offline admiralvorian

  • Thread Starter
  • Posts: 321
  • Location: United States
  • DIY
Re: infosec risk in custom keyboard deployment
« Reply #6 on: Fri, 12 May 2017, 13:52:50 »
Quote
In that case, any device with upgradeable firmware would have to be banned.
Not really; my cell phone, for instance, will not accept any firmware without the appropriate cryptographic signature. There are attacks against this, I'm sure, but it's another fence.

Quote
Anyway, what would be the malicious function of a keyboard controller? A keylogger? I doubt it, if the controller is an ordinary Teensy or Pro Micro, because the on-board memory is tiny.
For a keylogger attack, just offload the data; I just saw a thread about a 32u4 on a BLE board. Or simply wait for some context that makes you reasonably confident that the following keystrokes will be a password (like a known username), and log only that.

How about a different attack, though? Malicious mouse/keyboard input triggered by a certain key combination: I could exfiltrate private SSH keys with one command pretty easily.
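As a rough sketch of why tiny on-board memory is no obstacle for this kind of selective logging: stay idle until a trigger string is typed, then record only a short window. The trigger string, buffer size, and `on_key` hook below are all hypothetical, and there is no USB/HID plumbing here:

```c
#include <assert.h>
#include <string.h>

/* Hypothetical trigger-window recorder: idle until the trigger string is
 * typed, then capture the next CAPTURE_N keystrokes. A naive prefix matcher
 * is enough for a sketch; total state is a few dozen bytes. */
#define TRIGGER   "ssh "
#define CAPTURE_N 24

static char     capture[CAPTURE_N + 1];  /* static storage zero-initialized */
static unsigned cap_len   = 0;
static unsigned matched   = 0;  /* chars of TRIGGER seen so far */
static unsigned capturing = 0;  /* keystrokes left to record */

static void on_key(char c) {
    if (capturing) {
        if (cap_len < CAPTURE_N)  /* never overrun the buffer */
            capture[cap_len++] = c;
        capturing--;
        return;
    }
    matched = (c == TRIGGER[matched]) ? matched + 1
                                      : (c == TRIGGER[0] ? 1 : 0);
    if (TRIGGER[matched] == '\0') {  /* full trigger typed */
        matched   = 0;
        capturing = CAPTURE_N;
    }
}
```

A couple dozen bytes of capture buffer is well within even an ATmega32u4's spare RAM, which is the point being argued here.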

Quote
Furthermore, this is such an obscure attack vector, that it's very unlikely to be used in practice. There are bigger fish to catch.

Just because it's low risk doesn't mean it's zero risk. It was determined to be an unacceptable risk by the Decision Makers, especially when the solution is so easy.

Offline admiralvorian

  • Thread Starter
  • Posts: 321
  • Location: United States
  • DIY
Re: infosec risk in custom keyboard deployment
« Reply #7 on: Fri, 12 May 2017, 13:59:32 »
So a Teensy can't be put into bootloader mode without shorting that pin to ground? I know TMK also has a firmware keystroke trigger, so:


  • rip off the reset button and encase the controller in resin/plastic after building it, and
  • build your final firmware without a bootloader-mode trigger

This seems like it'd bring a Teensy to a level similar to a mass-produced keyboard, or better, since the firmware is open source.

I still find myself wondering what it would cost to manufacture replacement 'static' controllers in a batch of 100 or so...

Offline Phenix

  • Posts: 533
  • Location: Germany
Re: infosec risk in custom keyboard deployment
« Reply #8 on: Fri, 12 May 2017, 14:06:46 »
The primary question is WHICH scenario it is:
is it your coworker/boss (physical) or a cybercriminal (online)?

QMK/TMK can be reflashed after a button press (the keycode is called RESET, but it can be deactivated, IIRC).

Offline davkol

  • Posts: 4629
  • Location: CZ
Re: infosec risk in custom keyboard deployment
« Reply #9 on: Fri, 12 May 2017, 14:20:03 »
Any USB device may be programmed to alter its function depending on circumstances; see, e.g., BadUSB. The stock controller may not be reflashable, but it's quite straightforward to put another microcontroller between the device and the host (like hasu's USB-to-USB converter).

Offline Findecanor

  • Posts: 3460
  • Location: Stockholm, Sweden
Re: infosec risk in custom keyboard deployment
« Reply #10 on: Fri, 12 May 2017, 14:27:20 »
Quote
yea, but it would make it harder.. a bit.
isnt it the same with safes? you pay for the extra time your goods are safe compared to just an shelf.
Well, in this case the point is not theft but that it would become difficult to remove the encasing without damaging the board itself.

BTW, there are other microcontrollers that you can program so that they cannot be reprogrammed in any way, or even so that part of the EEPROM is not reprogrammable, in case you want to ship it with a proprietary library.
The ATmega32u4 does not have that, AFAIK. I think there are Freescale controllers that do something like that, though, and those are supported by a few open-source keyboard firmwares.
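For reference, on Kinetis (Freescale/NXP) parts this is done through a 16-byte "flash configuration field" that the firmware image places at flash offset 0x400; programming its security byte locks out external debug and reprogramming. The layout below is a sketch from memory of the Kinetis reference manuals; verify against your exact part's manual before relying on any of it, since a wrong security byte can permanently lock you out:

```c
#include <assert.h>
#include <stddef.h>
#include <stdint.h>

/* Sketch of the Kinetis flash configuration field (flash offset 0x400).
 * Field meanings are from memory of the reference manuals -- verify against
 * your specific part before programming anything. */
typedef struct {
    uint8_t backdoor_key[8]; /* 0x400: backdoor comparison key */
    uint8_t fprot[4];        /* 0x408: program-flash region protection */
    uint8_t fsec;            /* 0x40C: flash security byte (SEC/MEEN bits) */
    uint8_t fopt;            /* 0x40D: flash option byte */
    uint8_t feprot;          /* 0x40E: EEPROM protection */
    uint8_t fdprot;          /* 0x40F: data-flash protection */
} kinetis_flash_config_t;
```

With the security byte set to a secured value and mass erase disabled (its MEEN bits), such a part becomes effectively write-once from the outside, which is the property the ATmega32u4 lacks.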
"It is no measure of health to be well adjusted to a profoundly sick society."
Daily driver: Phantom (Lubed Cherry MX Clear, Lasered Cherry PBT keycaps with Row A. Plastic "Frankencase". Custom firmware, Swedish layout)

Offline Phenix

  • Posts: 533
  • Location: Germany
Re: infosec risk in custom keyboard deployment
« Reply #11 on: Fri, 12 May 2017, 14:33:20 »
It's interesting seeing your thoughts about it.
Nevertheless, I think if a coworker wants to read out data he will add a logger (either on the PC or between the USB port and the keyboard),
and a hacker would use a trojan or a software logger straight away.
Just my 2 cents.

Offline Data

  • Posts: 2361
  • Location: Orlando, FL
Re: infosec risk in custom keyboard deployment
« Reply #12 on: Fri, 12 May 2017, 14:35:27 »
What's the scenario, exactly?  What "malicious firmware" do you imagine would be appropriate in this case?

The Teensy microcontrollers (and most or all ATmega32u4-based keyboards, as far as I know) have significant memory limitations that would make this attack vector very unattractive to a hacker. But I could be wrong on that, I suppose; I'll have to look into it.

Online ika

  • Posts: 653
  • Location: NE Ohio
Re: infosec risk in custom keyboard deployment
« Reply #13 on: Fri, 12 May 2017, 14:36:26 »
Wouldn't it just be easier to hide a physical keylogger inside the case? This seems overly complicated for a possible security concern.

Offline Tactile

  • Posts: 828
  • Location: Portland, OR
Re: infosec risk in custom keyboard deployment
« Reply #14 on: Fri, 12 May 2017, 15:08:48 »
It really seems a bit silly to worry about malicious code being flashed to a keyboard while completely disregarding the computer (with networking, USB ports, BIOS flash capability, hard-drive firmware updates, etc.) just a few feet away. In most cases, if a bad guy has physical access to the keyboard, they also have physical access to the computer, and in that case all bets are off, even if you unplug the keyboard and take it with you.

Offline Phenix

  • Posts: 533
  • Location: Germany
Re: infosec risk in custom keyboard deployment
« Reply #15 on: Fri, 12 May 2017, 15:27:38 »
Quote
Wouldn't it just be easier to hide a physical keylogger inside the case? This seems overly complicated for a possible security concern.

It would be enough to add one between the USB port and the cable, provided it's a back port.
In normal office use this won't even get detected (but of course, inside the case is better).


Offline vvp

  • Posts: 651
Re: infosec risk in custom keyboard deployment
« Reply #16 on: Fri, 12 May 2017, 15:47:41 »
If an attacker has physical access, then there is no security any more, whether the keyboard is a closed/commercial one or an ErgoDox.

But if one cares, it is possible to make it harder for an attacker, even with an ErgoDox. Just disable bootloader boot in the fuses, and one can also erase the bootloader itself to be doubly sure. But it does not change that much: an attacker can come with a hardware debugger/programmer. And if the MCU supports permanently disabling debugging support, then the attacker can just come with a hot-air station and replace the breakout board or the controller itself.
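To make the fuse option concrete for the ATmega32u4: bootloader-on-reset is governed by the BOOTRST bit of the high fuse byte (fuses are active-low, so 0 means "programmed"), alongside the related HWBE hardware-boot bit in the extended fuse. A small sketch of the arithmetic, based on my reading of the datasheet; double-check the layout for your chip before burning anything, since a wrong fuse byte can brick it:

```c
#include <assert.h>
#include <stdint.h>

/* ATmega32u4 high-fuse BOOTRST position, per the datasheet as I recall it.
 * Active-low: a set bit (1) means "unprogrammed". Verify before burning. */
#define HFUSE_BOOTRST (1u << 0)  /* 0 = jump to bootloader section on reset */

/* Un-programming BOOTRST (setting the bit to 1) makes every reset start
 * the application at address 0x0000, never the bootloader section. */
static uint8_t hfuse_without_bootrst(uint8_t hfuse) {
    return (uint8_t)(hfuse | HFUSE_BOOTRST);
}
```

A board shipped with, say, HFUSE 0xD8 (BOOTRST programmed) would become 0xD9, burned over ISP with something like avrdude's `-U hfuse:w:0xD9:m` (the exact value and programmer flags depend on your setup).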

Offline admiralvorian

  • Thread Starter
  • Posts: 321
  • Location: United States
  • DIY
Re: infosec risk in custom keyboard deployment
« Reply #17 on: Tue, 16 May 2017, 16:53:58 »
If you're commenting here just to say "that's stupid, you don't need to worry about it," you have entirely missed the point. "If an attacker has physical access, nothing matters anymore"?

I guess I'll just never patch my system, lock my screen, use good passwords, or use a password manager, because all bets are off if an attacker gets physical access.

Whether or not it's stupid, and whether or not it's something that needs to be worried about, it IS being worried about by my infosec team. Any keyboard with good macro support (not just custom ones) carries more risk than a standard dumb commercial board.

Quote
yea, but it would make it harder.. a bit.
isnt it the same with safes? you pay for the extra time your goods are safe compared to just an shelf.
Well, in this case the point is not theft but that it would become difficult to remove the encasing without damaging the board itself.

BTW, there are other microcontrollers that you can program so that they can not be reprogrammed in any way, or even that parts of the EEPROM is not reprogrammable in case you would want to ship it with a proprietary library.
The ATmega32u4 does not have that AFAIK. I think there are Freescale controllers that do something like that  though and those are supported by a few open source keyboard firmwares.

^^ This is the type of information I'm looking for: comments and suggestions on how I can make a custom keyboard that satisfies the restriction that it can't be reprogrammable.

 

Offline dubious

  • 키랩 홍보대사
  • Posts: 255
  • Location: Layer 8
Re: infosec risk in custom keyboard deployment
« Reply #18 on: Tue, 16 May 2017, 17:20:50 »
lol managers...
« Last Edit: Tue, 16 May 2017, 17:24:04 by dubious »

Offline Tactile

  • Posts: 828
  • Location: Portland, OR
Re: infosec risk in custom keyboard deployment
« Reply #19 on: Tue, 16 May 2017, 17:36:01 »
If your infosec people are worried about a villain reflashing a keyboard with malware, these things will give them nightmares.

Offline dubious

  • 키랩 홍보대사
  • Posts: 255
  • Location: Layer 8
Re: infosec risk in custom keyboard deployment
« Reply #20 on: Tue, 16 May 2017, 17:41:48 »
ya and definitely don't tell them hackers can hear what you are typing lol  :eek:

http://www.berkeley.edu/news/media/releases/2005/09/14_key.shtml

Offline vvp

  • Posts: 651
Re: infosec risk in custom keyboard deployment
« Reply #21 on: Tue, 16 May 2017, 18:21:13 »
Tell your infosec team to flash a known firmware and remove the bootloader from the ErgoDox; then they should be happy.

If they still aren't happy after that, ask them whether all the USB ports are either removed or filled with glue. The same goes for FireWire ports and PCIe expansion slots (if one can get at them), because these provide DMA access to memory. Are the computer cases sealed? It would also be a good idea to protect the network cables. And don't forget that mobile phones with cameras should be banned, because one can very easily photograph screens or record audio with them. People entering the office should, in general, be searched for suspicious hardware.

Offline admiralvorian

  • Thread Starter
  • Posts: 321
  • Location: United States
  • DIY
Re: infosec risk in custom keyboard deployment
« Reply #22 on: Thu, 18 May 2017, 16:37:09 »
Quote
Tell your infosec team to flash known firmware and remove the bootloader from the ergodox and then they should be happy.

If they will not feel happy after doing that then ask them first whether all the USB ports are either removed or filled with glue. The same with fire-wire ports and PCIe expansion ports (if one can get to them) because these provide DMA access to memory. Are the computer cases sealed?  It would be also good idea to protect the network cables. And don't forget that mobile phones with camera should be banned because one can really easily make screenshots with them or record sounds. People entering the office should be searched for suspicious hardware in general.

We actually do some* of these things, ya snarky little snot.

*can't say which

Offline vvp

  • Posts: 651
Re: infosec risk in custom keyboard deployment
« Reply #23 on: Fri, 19 May 2017, 01:54:43 »
Some of them are not enough. It is all low-hanging fruit for an attacker.

Edit: Especially if your company has such strict security that it cannot even somewhat trust its own employees, then they need to search employees for suspicious hardware when they enter the premises. They need to track everything people do on the computers and audit the logs regularly. And they should have security cameras in the offices and review those too.

But obviously, since they allowed your coworker to bring in an ErgoDox, they do not care that much about the hardware people bring into the office. And if that is so, then discussing whether somebody can flash ErgoDox firmware looks more like security theater than any real security. Well, at least the theater will make some people happy.
« Last Edit: Fri, 19 May 2017, 02:07:46 by vvp »

Offline vvp

  • Posts: 651
Re: infosec risk in custom keyboard deployment
« Reply #24 on: Fri, 19 May 2017, 02:28:39 »
Just to give you my point straight.

It is not about bringing in an ErgoDox, which allows easy firmware updates if the bootloader is not erased/disabled. It is about bringing in any electronic hardware in general, whether it looks strange like an ErgoDox or looks like a common off-the-shelf keyboard or some other peripheral. Although the hardware can look OK from the outside, its electronics can be designed to also do something unauthorized.

If you had any real security (instead of just security theater), then your coworker would not have been able to bring in an ErgoDox at all without your famous infosec group first reviewing the ErgoDox hardware and software. And during that process they could have disabled the ErgoDox bootloader easily, and you would not have needed to post here at all :)

Offline Leslieann

  • * Elevated Elder
  • Posts: 1701
Re: infosec risk in custom keyboard deployment
« Reply #25 on: Fri, 19 May 2017, 02:32:29 »
So long as the chip can be accessed, you can flash it.

It may take desoldering, or just some alligator clips, but it can be done. It's how you flash the "secure" BIOS on the Lenovo X230, which is supposed to be locked/secured. Some people do it while the chip is still installed.
Filco MJ2 L.E. w/Vortex case, hand milled case, custom feet, custom paint, Klaxxon key caps, lubed and o-ringed Jailhouse Blues made from vintage Cherry MX Blues, HID Liberator, stainless steel universal plate, 3d printed adapters, removable cord, sound dampened. Winkey blockoff plate | Magicforce 68 w/Outemu Blues |KBT Race S L.E. w/Ergo-Clears, custom WASD keyset | Das Pro w/browns (Costar model) | IBM Model M (x2)

Offline vvp

  • Posts: 651
Re: infosec risk in custom keyboard deployment
« Reply #26 on: Fri, 19 May 2017, 03:33:22 »
Yes, it can, but then you already need a hardware device for it (either a debugger/programmer, or a hot-air station or similar soldering equipment). And if employees are disallowed from bringing in such hardware, then they cannot flash it, provided the bootloader is not present or is disabled in the fuses.

I do not know how many chips there are that do not allow reflashing at all. Those probably contain only a PROM and not flash. My guess is that almost all allow reflashing, and that is true of off-the-shelf keyboards too; it is just too convenient for production. A lot of MCUs allow you to disable reading of the firmware in the MCU, but typically you can still erase it and then flash a new one.

So if somebody is disallowed from bringing in an ErgoDox with a disabled bootloader, then he should be disallowed from bringing in any hardware peripheral (because one can just reflash the firmware, or replace the MCU, and bring the peripheral in modified that way).

That is the point I'm trying to get across to admiralvorian. If they already allowed him to bring in an ErgoDox and connect it, then the only thing they should require him to do is disable the bootloader. They may want to review the firmware too. And if that is not enough, then they are not consistent, since they already allowed him to bring in the hardware, which is a big NO-NO in high-security places, because an employee could then bring in, e.g., the keylogger Tactile posted. In other words, his infosec team is putting on quite a nice security theater if they want to inconvenience people over an ErgoDox (with a disabled bootloader).

Offline kolec94

  • Posts: 100
Re: infosec risk in custom keyboard deployment
« Reply #27 on: Fri, 19 May 2017, 13:04:41 »
How are they stopping you from using an unapproved device?
Is it just policy, or are they blocking it on the machine?

kbparadise v60 blues

Offline admiralvorian

  • Thread Starter
  • Posts: 321
  • Location: United States
  • DIY
Re: infosec risk in custom keyboard deployment
« Reply #28 on: Wed, 24 May 2017, 16:53:12 »
Quote
Yes, it can but you already need a hardware device for it (either debugger/programmer or a hot-air station (or some similar soldering equipment)). And if the employees are disallowed to bring in such a hardware device then they cannot flash it if bootloader is not present or is disabled in fuses.

I'm do not know how many chips are there which do not allow re-flash at all. Those probably only contain a PROM and not FLASH. My guess is the almost all allow re-flashing and that would be true for off-the-shelf keyboards too. It is just too convenient for production. A lot of MCUs allow you to disable reading of the firmware in MCU but typically you can erase it and then flash a new one.

So if somebody is disallowed to bring in ergodox with disabled bootloader then he should be disallowed to bring in any hardware peripheral (because one can just reflash firmware or replace the MCU in it and bring it modified in such a way).

That is the point I'm trying to send admiralvorian across. If they already allowed bringing an ergodox and connecting it then the only thing they should require him to do is to disable the bootloader. They may want to review the firmware too. And if that is not enough for admiralvorian then he is not consistent since they already allowed him to bring in the hardware. Which is a big NO-NO in high security places since then the employee would be able to bring in e.g. the keylogger Tactile posted. In other words, his infosec team is making quite a nice security theater in the company if they want to inconvenience people because of ergodox (with disabled bootloader).

I'm not sure you understand what's going on here.

It's plain that there is no real-world, practical benefit coming from the ban on custom keyboards. Your posts are just arguing with the wind.

My point is that, regardless of merit, the decision has been made here. I'm looking for interesting ways to comply with the decision and still have my fun.

Offline MajorKoos

  • Posts: 286
  • Location: Bay Area
  • 1 life please. Extra large.
Re: infosec risk in custom keyboard deployment
« Reply #29 on: Wed, 24 May 2017, 18:57:03 »
Quote
Yes, it can but you already need a hardware device for it (either debugger/programmer or a hot-air station (or some similar soldering equipment)). And if the employees are disallowed to bring in such a hardware device then they cannot flash it if bootloader is not present or is disabled in fuses.

I'm do not know how many chips are there which do not allow re-flash at all. Those probably only contain a PROM and not FLASH. My guess is the almost all allow re-flashing and that would be true for off-the-shelf keyboards too. It is just too convenient for production. A lot of MCUs allow you to disable reading of the firmware in MCU but typically you can erase it and then flash a new one.

So if somebody is disallowed to bring in ergodox with disabled bootloader then he should be disallowed to bring in any hardware peripheral (because one can just reflash firmware or replace the MCU in it and bring it modified in such a way).

That is the point I'm trying to send admiralvorian across. If they already allowed bringing an ergodox and connecting it then the only thing they should require him to do is to disable the bootloader. They may want to review the firmware too. And if that is not enough for admiralvorian then he is not consistent since they already allowed him to bring in the hardware. Which is a big NO-NO in high security places since then the employee would be able to bring in e.g. the keylogger Tactile posted. In other words, his infosec team is making quite a nice security theater in the company if they want to inconvenience people because of ergodox (with disabled bootloader).

I'm not sure you understand what's going on here;

It's plain that there is no real-world, practical benefit coming from the ban of custom keyboards. Your posts are just arguing with the wind.

My point is that, regardless of merit, the decision has been made here. I'm looking for interesting ways to comply with the decision and still have my fun.

I can totally grok the paranoia. I would not trust someone else's custom keyboard either. A ban on custom keyboards isn't actually a big deal. The serious players supply all the kit for you to use, and make you hand over all your stuff and stick it in a locker before you're even allowed through the door. They also have a camera looking over your shoulder as you use their machine to do your work. When you start working in sensitive environments, securing the supply chain for your equipment becomes a priority.

Removing the bootloader and using epoxy to prevent tampering may help, but you'd need to get infosec or IT to do it for you, for attestation purposes.
« Last Edit: Wed, 24 May 2017, 19:29:14 by MajorKoos »

Offline Findecanor

  • Posts: 3460
  • Location: Stockholm, Sweden
Re: infosec risk in custom keyboard deployment
« Reply #30 on: Thu, 25 May 2017, 06:03:03 »
When I worked in a security-sensitive environment for a few months, the rule was that all peripherals had to be approved by the IT department beforehand. I could perhaps have gotten my Ducky TKL through the approval procedure, but the issued Sun Type 6 was good enough.
"It is no measure of health to be well adjusted to a profoundly sick society."
Daily driver: Phantom (Lubed Cherry MX Clear, Lasered Cherry PBT keycaps with Row A. Plastic "Frankencase". Custom firmware, Swedish layout)

Offline TerryMathews

  • Posts: 209
Re: infosec risk in custom keyboard deployment
« Reply #31 on: Fri, 26 May 2017, 12:46:45 »
Quote
if you're commenting here just to say "that's stupid, you don't need to worry about it" you have entirely missed the point. "If an attacker has physical access nothing matters anymore"

 I guess I'll just never patch my system, lock my screen, use good passwords, use password managers, etc. because all bets are off if an attacker gets physical access.

whether or not it's stupid, whether or not it's something that needs to be worried about, it's something that IS being worried about by my infosec team. any keyboard with good macro support (not just custom ones) carries more risk than the standard dumb commercial board.

yea, but it would make it harder.. a bit.
isnt it the same with safes? you pay for the extra time your goods are safe compared to just an shelf.
Well, in this case the point is not theft but that it would become difficult to remove the encasing without damaging the board itself.

BTW, there are other microcontrollers that you can program so that they can not be reprogrammed in any way, or even that parts of the EEPROM is not reprogrammable in case you would want to ship it with a proprietary library.
The ATmega32u4 does not have that AFAIK. I think there are Freescale controllers that do something like that  though and those are supported by a few open source keyboard firmwares.

^^ this is the type of information I'm looking for. comments and suggestions on how i can make a custom keyboard that beats the restriction of "it can't be re-programmable"

Any commercial keyboard that has backwards-compatible support for PS/2 is equally at risk. I can build a Hasu PS/2-to-USB converter out of a $3 Pro Micro and put it in the case.

Of course, I wouldn't have time to do all that custom work at your location, but if I knew what kind of keyboard you used, I could order one as a test bed to develop my keylogger.

If that cable is detachable internally and I can source male and female ends for it, I could compromise your keyboard in less than a minute, and if I set the VID and PID correctly your PC wouldn't know the difference.
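To illustrate the VID/PID point: the host identifies a device by the fields of its standard 18-byte USB device descriptor, so an implant that copies idVendor/idProduct (and the string indices) from the original keyboard enumerates identically. The layout follows the USB 2.0 specification; the example IDs below are made-up placeholders, not a real vendor's:

```c
#include <assert.h>
#include <stdint.h>

/* Standard USB device descriptor (USB 2.0 spec, 18 bytes). An implant that
 * clones idVendor/idProduct from the original keyboard is indistinguishable
 * at enumeration time. IDs below are placeholders. */
#pragma pack(push, 1)
typedef struct {
    uint8_t  bLength;          /* 18 */
    uint8_t  bDescriptorType;  /* 1 = DEVICE */
    uint16_t bcdUSB;
    uint8_t  bDeviceClass;
    uint8_t  bDeviceSubClass;
    uint8_t  bDeviceProtocol;
    uint8_t  bMaxPacketSize0;
    uint16_t idVendor;         /* cloned from the original keyboard */
    uint16_t idProduct;        /* cloned from the original keyboard */
    uint16_t bcdDevice;
    uint8_t  iManufacturer;
    uint8_t  iProduct;
    uint8_t  iSerialNumber;
    uint8_t  bNumConfigurations;
} usb_device_descriptor_t;
#pragma pack(pop)

static const usb_device_descriptor_t spoofed = {
    .bLength = 18, .bDescriptorType = 1, .bcdUSB = 0x0200,
    .bMaxPacketSize0 = 8,
    .idVendor = 0x1234, .idProduct = 0x5678,  /* placeholder VID/PID */
    .bcdDevice = 0x0100, .bNumConfigurations = 1,
};
```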

As another poster alluded, the larger issue is that the ATmega32u4 doesn't have the space to store much beyond the keyboard software, and doesn't really have the speed to bit-bang SD card support AND run a keyboard at the same time.

Now, if I'm a professional spy agency, all bets are off: a Cortex-M class MCU would have the speed to do keyboard and SD duty simultaneously, or a Pi Zero has a built-in SD card slot.

Personally, I'd put banning custom keyboards on the same level as TEMPEST shielding: a theoretically plausible solution for an unlikely attack vector.

Re: infosec risk in custom keyboard deployment
« Reply #32 on: Tue, 30 May 2017, 13:11:10 »
Interesting dilemma.

I've worked in environments ranging from super casual (bring your own laptop, put it on the network, and work from that if you like) to the strictest imaginable. I've even done some red team work attempting to gain unauthorized access to them.

Bottom line: if your infosec team is *that* concerned about this sort of attack vector, they shouldn't be allowing any outside tech in at all, regardless of whether or not it is reprogrammable, especially where custom hardware is concerned. They'd have to have a subject-matter expert on hand to inspect and verify each device.

Allowing *any* device from outside is an unnecessary risk, really. I've seen modified "standard issue" devices that were confiscated (e.g. a Dell keyboard with a keylogger built in, and a Microsoft mouse that had effectively been turned into a USB storage device).

Offline Koren

  • Posts: 110
  • Location: France
Re: infosec risk in custom keyboard deployment
« Reply #33 on: Tue, 30 May 2017, 18:30:44 »
Quote
obvious enough that this has gone so far as to result in a complete ban of keyboards with custom controllers (among other peripherals) and the creation of an 'approved peripherals' list at my workplace.

whether or not it's stupid, whether or not it's something that needs to be worried about, it's something that IS being worried about by my infosec team. any keyboard with good macro support (not just custom ones) carries more risk than the standard dumb commercial board.
Does it, really?

First, commercial keyboards usually have flashable firmware too... I know Logitech boards do. I haven't looked into the possibility of putting a hacked firmware on a commercial keyboard, but if I were a spy, I think I'd spend some time on it. There are far more targets that could be hacked through a common commercial keyboard than through an ErgoDox...

Besides, I hope your infosec team has banned all wireless devices? Because if you want a common weakness, commercial wireless keyboards and mice are a really common way to hack someone.

See, for example, MouseJacking:

A $15 antenna, and you can hack a lot of people in a 100-200 m radius who use a wireless radio mouse...


Quote
^^ this is the type of information I'm looking for. comments and suggestions on how i can make a custom keyboard that beats the restriction of "it can't be re-programmable"
Well, commercial keyboards are re-programmable ;)

Quote
If your infosec people are worried about a villain reflashing a keyboard with malware, these things will give them nightmares.
That's actually a real threat...

I used one of those to demonstrate to my school how dangerous their practices were. Teachers were asked to log in in the classroom to record who was absent. The problem: even though there's an "in-classroom" mode with limited access, it's the same password, and that password protects everything, including marks and exam results.

Worse, the computers are enclosed in desks, and it's a nightmare to check whether there's a dongle between the keyboard and the computer. A student can install one in a couple of minutes, but you can't spend five minutes moving the PC to check for a dongle every time you log in.

Quote
Anyway, what would be the malicious function of a keyboard controller? A keylogger? I doubt it, if the controller is an ordinary Teensy or Pro Micro, because the on-board memory is tiny.
It's sufficient, though...

Just record the 20-30 characters that follow a couple of trigger words (like su, ssh, root, admin, or a set of known logins), and even with a couple hundred bytes of memory you should be able to capture passwords.


Maybe I'm paranoid, but each time I use a password on a public computer (in internet cafés, for example), I open a notepad alongside the browser and use the mouse between keystrokes, so that the characters are entered in the wrong order, some going into the notepad and some into the browser.

So a keylogger may know that my password is 15 characters out of 30, in a different order. Good luck to anyone using one. It's a hassle, but it makes me feel better ;)

Offline davkol

  • Posts: 4629
  • Location: CZ
Re: infosec risk in custom keyboard deployment
« Reply #34 on: Wed, 31 May 2017, 03:25:16 »
People use an on-screen keyboard for that.

Offline Koren

  • Posts: 110
  • Location: France
Re: infosec risk in custom keyboard deployment
« Reply #35 on: Fri, 02 June 2017, 19:11:16 »
If it's available, yes, but that's not always a given (at least in some places I visited).