Bear in mind that all of that mostly refers to a reference from July 1985. The controller from your 42H1292 [ref] has far fewer external components than on that logic diagram. Obviously, stuff has changed a bit...
One thing I failed to discuss is the timing of the communication protocol (see computer-engineering.org). Never mind what goes on inside your keyboard; if it can't talk to the computer properly, you're stuffed. So everyone had better get it right (and do it the same way...).
Keyboards and computers communicate using two lines, Data (for... data) and Clock (for timing), each of which is either high or low.
Normally (i.e. when transmission is possible), both Data and Clock are high.
When the keyboard starts talking, the Clock line is pulsed: it goes low, then high, then low, then high, and so on.
The Data line is read by the computer each time the Clock goes low.
Each scan code (eight bits) is sent with a parity [error-checking] bit and surrounded by bits marking the start and stop of transmission, so with eleven bits in total, the Data line is read eleven times.
When the Clock is no longer required to be low, it goes high again to its normal state.
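To make the framing concrete, here is a minimal sketch in C that builds the eleven-bit frame for one scan code. It assumes the usual PS/2 conventions (data bits sent least significant first, odd parity), which go beyond what's stated above, so treat it as illustrative:

#include <stdio.h>
#include <stdint.h>

/* Build the 11-bit frame for one scan code, assuming PS/2 conventions:
 * start bit (0), eight data bits LSB first, odd parity, stop bit (1).
 * Bit 0 of the result is the first bit sent. */
static uint16_t make_frame(uint8_t scancode)
{
    uint16_t frame = 0;            /* start bit is 0, already in place */
    int ones = 0;

    for (int i = 0; i < 8; i++) {  /* data bits, least significant first */
        int bit = (scancode >> i) & 1;
        ones += bit;
        frame |= (uint16_t)bit << (1 + i);
    }
    /* odd parity: the parity bit makes the total count of 1s odd */
    frame |= (uint16_t)((ones % 2 == 0) ? 1 : 0) << 9;
    frame |= (uint16_t)1 << 10;    /* stop bit is 1 */
    return frame;
}

int main(void)
{
    uint16_t f = make_frame(0x1C); /* 0x1C: Set 2 make code for 'A' */
    for (int i = 0; i < 11; i++)   /* print in the order the bits go out */
        printf("%d", (f >> i) & 1);
    printf("\n");
    return 0;
}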
From the above-mentioned page:
"The clock frequency must be in the range 10 - 16.7 kHz. This means clock must be high for 30 - 50 microseconds and low for 30 - 50 microseconds."So, starting high, the Clock goes low 11 times and goes high 10 times during transmission. Therefore, the actual communication of a scan code to a computer must take max. 21 x 50µs = 0.00105 seconds. If you had it show up in blinkenlights, that'd probably be quite easy to miss...
However, there has to be some time between transmissions to keep them apart. Again, from the page:
"The Clock line must be continuously high for at least 50 microseconds before the device can begin to transmit its data.". The delay must be at
least 50µs, then.
So how long would it take to transmit, say, 50 keys mashed all at once, read by an optimal matrix, scanned, buffered, translated, and read at the other end by optimal controllers?
If we use the minimum possible time between transmissions, it would be 50 x (21 x 50µs + 50µs) = 50 x 1100µs = 55,000µs = 0.055 seconds (there's a quick code check after the list below). This is unrealistic because:
Contact bounce in mechanical switches may necessitate some delay to ensure accurate detection
Either identifying or calculating the scan code takes time
Writing and reading the buffer takes time
Calculating and appending the parity bit takes time
Prepending and appending the start and stop bits takes time
And how much of the last four could be done in parallel? Would any delays be required as a result?
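As promised above, here's a quick sanity check of the arithmetic; a sketch whose only inputs are the worst-case 50µs half-period and the minimum 50µs gap quoted earlier:

#include <stdio.h>

int main(void)
{
    const double half_us = 50.0;       /* worst-case clock half-period */
    const double gap_us  = 50.0;       /* minimum idle time between frames */
    const int    keys    = 50;

    /* 11 clock lows + 10 clock highs = 21 half-periods per 11-bit frame */
    double frame_us = 21 * half_us;                /* 1050 us = 0.00105 s */
    double total_us = keys * (frame_us + gap_us);  /* 50 x 1100 us */

    printf("one frame : %.0f us\n", frame_us);
    printf("%d frames : %.0f us = %.3f s\n",
           keys, total_us, total_us / 1e6);        /* 55000 us = 0.055 s */
    return 0;
}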
That said, with memory access times of nanoseconds rather than microseconds, and microcontrollers capable of multiple MIPS, I fail to see any good reason for much electronic delay at all nowadays. Electromechanical delay seems far too complicated for me to bother considering right now, and is surely nominal anyway.
But, getting back to trying to get a time for the 50 keystrokes: what would be a realistic interval between scan code data transmissions?
A very realistic interval would be a real one; get an oscilloscope on a "live" Data line and press two keys (or as many as you can get away with) as synchronously as possible. Unfortunately I don't have access to a 'scope. Any help there?
I had a look around the 'net for such information, but everyone just documents the protocol, shows how one scan code is sent and leaves it at that. The 7531 manual does the same, but also documents the host commands [stuff the computer can tell the keyboard to do], among which is the Typematic [repeat] Rate/Delay command (I actually already included a reference to this in the extensive manual quote above).
Trying not to go into too much detail (har): the Typematic Rate is the rate at which a keystroke repeats after the first repetition (the delay before that first repetition being the Typematic Delay), and it is determined by the setting of five bits. The calculation given for this is:
Rate = (8 + A) x (2^B) x 0.00417 seconds, where A and B are the binary values of multiple bits (if you really care, I'll tell you what goes where). Note that a "rate" measured in seconds is really the interval between repeats; the rate proper is its reciprocal.
To save any actual calculating, a table shows that setting all the bits low produces a Rate of 30.0 ± 20% make codes (effectively keystrokes, for one-byte [most] keys) per second, or approx. 33.36 (again, probably ± 20%) milliseconds per keystroke.
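Here's a sketch of that calculation in C. The split of the five bits into A (assumed to be the low three) and B (the next two) is my guess at the layout, not the manual's word:

#include <stdio.h>

/* Interval between repeats, per the manual's formula:
 * (8 + A) x 2^B x 0.00417 seconds.
 * Assumed bit layout (hypothetical; the manual gives the real one):
 * A = low three bits of the rate field, B = the next two. */
static double typematic_interval_s(unsigned rate_bits)
{
    unsigned a = rate_bits & 0x07;        /* A: bits 0-2 (assumed) */
    unsigned b = (rate_bits >> 3) & 0x03; /* B: bits 3-4 (assumed) */
    return (8 + a) * (double)(1u << b) * 0.00417;
}

int main(void)
{
    /* all five bits low: (8+0) x 1 x 0.00417 = 0.03336 s -> ~30/second */
    double t = typematic_interval_s(0x00);
    printf("interval %.5f s, rate %.1f make codes/s\n", t, 1.0 / t);
    return 0;
}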
Furthermore, doing the typematic stuff likely "wastes" some of that time. As it's eight data bits you have to send for that Typematic Rate/Delay command, I have to wonder if the 8 and the "A" bits in the calculation relate to each bit of the make code, possibly making the 0.00417 the per-bit execution time (of course, no explanation is given...).
So.
Proof that the protocol can deal with sending at least 30 characters per second, and so could early controllers (albeit repeating the same key; the keyboard still scans for other keys while in Typematic mode: hold down a key, then hold down another key, then another...). Just how much quicker could modern controllers be? How much quicker might they have been even then (see above at "oscilloscope")?
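To put a rough number on that question, compare the 30-per-second typematic figure with what the worst-case protocol timing alone would allow, using only the figures derived above:

#include <stdio.h>

int main(void)
{
    const double frame_us = 21 * 50.0;  /* worst-case 11-bit frame: 1050 us */
    const double gap_us   = 50.0;       /* minimum gap between frames */

    double ceiling   = 1e6 / (frame_us + gap_us); /* ~909 scan codes/second */
    double typematic = 30.0;                      /* from the rate table above */

    printf("protocol ceiling: ~%.0f codes/s\n", ceiling);
    printf("typematic rate  : %.0f codes/s (~%.0fx headroom)\n",
           typematic, ceiling / typematic);       /* roughly 30x */
    return 0;
}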
If we can get the matrix right, n-key rollover within regular numbers of human digits is quite possible...