Muirium wrote: ↑Xwhatsit's controller is a marvel, but not a perfect one. The single threshold value for every key in the keyboard is its Achilles Heel
Would you like a catbeep with that (coke)? (Referring to the ancient BIOS functionality of ThinkPad notebooks loudly beeping at you using the old-school PC beeper when you press down more keys than its eKRO membrane board can tell apart.)
Muirium wrote: ↑From what I've heard, the XTant is a bit of a wild child as far as Fs go. So it could be a particularly worthy adversary for a controller designer!
Ha! Bring your worst!
Nah, seriously, I am happy to cut my teeth on whatever you guys think would bring the most utility. I'm pretty uninformed on what is out there, just following this from an amateur EE (electrical engineer) perspective; but I thought why not try to help out the community, if it seems like a problem I might be able to tackle.
Question: but do we really need new hardware or a new layout? What is the missing feature or clear requirement? Would it be solved by just updating the firmware of the xwhatsit, which is open source, or perhaps by porting it to TMK?
Xwhatsit's hardware is nice, but I remember him explaining how tight the memory constraints were that he was working within. So you're right, but fresh minds are likely to want to try their own hand at a hardware replacement as well as the firmware and software side, simply because it's easier to build something without constraints cramping you at every step like a low ceiling.
Also, a question as I don't understand his controller well enough to know the answer: Xwhatsit's controller uses a single threshold value for every switch on the keyboard, **is this a hardware limitation?** All I know is it's right there in the user facing result. Could be that there's not enough hardware to throw at the per-key threshold problem we'd like to see solved. I don't know where in the chain the limit lies.
The xwhatsit controller uses an ATmega32U2: 8-bit, 16 MHz, 22 I/O, 32k flash, 1k RAM. It also needs two shift registers, two external comparators, and a DAC. The cost for all this is about $20 just for the parts.
A modern chip like an MK64-family ARM would be 32-bit, 120 MHz, 66 I/O, 1M flash, 256k RAM. It probably would not need shift registers, external comparators, or a DAC (all of that is built in), and you'd pay about the same for the parts.
But I think that the xwhatsit's easy user interface makes it more popular than the DPH controller. IMHO, the xwhatsit and Aikon have excellent user interfaces. I think that if the UI is not this easy, you really start losing people, because it is too hard to program.
Thanks for the support, guys. I wasn't there to read it at the time, but now it's giving me some warmth, I must confess.
..yeah, I was feeling pretty bad for the last 2 weeks. I'm suddenly much better since about noon. Not a good sign, means there will be more swings. Won't be shitposting or deleting anything, please lower your guns.
But yeah, if I'm whining - disregard, consider it therapeutic in itself (quite effective as therapy, btw. But it annoys everyone around you, so if you want to try it - use sparingly!)
Anyway. It was a way to remind everyone that if I suddenly disappear - the code is right there, it's always the latest version, you have my official permission to fork or do whatever you want with it. Very crude and inefficient way to say that, but still.
And yeah, a reminder to stay away from kinetis family for analog stuff.
Using ADCs on K20 is.. is like.. like eating shit with a large spoon. Sure, the spoon is quite large and the handle is somewhat convenient even. But it doesn't make your life better, y'know. Gory details:
Spoiler:
The key is NOT to look into application notes - there's severe risk of clawing out one's eyes. Especially when you discover there are _syntax_ errors in example code.
One can easily imagine some evil overlord laughing like a child, saying: "They want an ADC? We'll give them one. Oh no, TWO. We'll even pretend they're 16-bit - but will make the 6 least significant bits sit below the noise floor (4 in 12-bit mode, so effectively 8 bits). We'll even give them 24 ADC inputs (and pretend there are 32 on each ADC!) - but only 3 of them will be accessible to both ADCs, and another 4 inputs will have an -a pin and a -b pin, multiplexed via a _different_ configuration register, so you can't just select 4a and then 5b - you'll have to reconfigure the ADC every fscking time. And we'll even add a PGA and diff inputs - because marketing wants them - but the PGA will work on only _one_ diff input, and only another 2 inputs will support differential mode."
Also there's the PDB - programmable delay block. Made specifically to drive ADC sequencing - it doesn't look like it has any application anywhere else. BUT. It can only trigger conversions, and if you want to sequence 4a and 5b.. you're so out of luck - unless you use the end-of-conversion interrupt, which will reconfigure the ADC.. wait, why do we need the PDB then? Why not just drive the ADC programmatically? Also, the PDB cannot stop after, for example, 8 conversions. It runs either continuously or in single-shot mode, which, considering the above, is utterly worthless.
The eDMA engine kind of works - but because you need to wait 3 microseconds after you read the ADC result register (AFTER!!!) before you start the next conversion, or you'll read garbage, it's useless for my purposes. It's actually faster to busy-wait for the COCO flag (that's "conversion complete" - the lesson here is not to invite marketing to name technical stuff) and then read the register.
And then wait for 3 microseconds before the next conversion. Don't you forget about that one.
Anyway. HuBandiT, leave the idea of charging caps and then watching the voltage levels change. Come to the Dark Side! The first dimmable LED in the room will screw up all your readings, and it will be EXTREMELY HARD to account for it. But if you manage to pull that off - I would really like to see how. I love to see complex riddles solved. This one, I can say from the height of all of 3 months of experience with this stuff, is _complex_. See below why I think so.
Progress:
I did away with interlaced reads today - just scanning all the columns in a single pass turned out to be enough (will commit today/tomorrow - need to remove the resulting hairballs from the schematics and code, and describe what was there to warn future me not to go that way anymore). Interleaving is handled internally by switching the ADC to a grounded pin every other step - and it works; debouncing is not needed!
BUT when you flick the dimmer lying on the table 1 foot from the keyboard - it registers a keypress. And, immediately, a release, because the spikes the dimmer produces are very, very thin. They look like this (channel 4 is a "freestanding piece of wire connected to a probe", to show, for lack of a better word, "background radiation"):
[attachment: DS1Z_QuickPrint5.png]
as opposed to the normal course of events (the only difference is that a key is depressed on channel 2):
[attachment: DS1Z_QuickPrint6.png]
Notice that the Disturbance in the Force is barely wider than one ADC reading cycle - so you can have just one key from the whole row affected by it.
This can, most probably, be fixed by an IIR filter triggering on a rolling average of the last, let's say, 16 samples. This will delay response by, worst case, 16 cycles (which, at 3029 (measured!) scans per second is ~5.3 milliseconds - twice that if using a cheaper, single-ADC chip), while preserving the order of key presses down to 300/600 microseconds. (I'm lying here right now, because the current USB send routine waits for an ACK from the host for every packet it sends - this pauses everything for ~1ms. But nothing prevents me from writing a proper USB send routine or even moving the matrix scanner to an ISR.) Look at the pretty picture - that's what happens on keypress and release. I'm not a good touch-typist and cannot make a keypress less than 13 milliseconds long - so the picture is a bit crowded horizontally. Channel 2 is the adjacent key - this gives a pretty good idea of the magnitude of the crosstalk. Channel 1 is actually the leftmost row at rest - it gets a HUGE signal boost because its trace is so short.
[attachment: DS1Z_QuickPrint4.png]
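The rolling-average idea above can be sketched in a few lines. This is a hypothetical illustration, not actual CommonSense code; the type, names, and the threshold are all invented:

```c
#include <stdbool.h>
#include <stdint.h>

/* Hypothetical sketch of spike rejection via a 16-sample rolling
 * average: compare the windowed average, not the raw sample, against
 * the threshold. A single dimmer spike moves the average by only
 * spike/16, so it cannot cross a sanely chosen threshold, while a
 * sustained keypress crosses it within 16 scan cycles. */
#define AVG_WINDOW 16

typedef struct {
    uint16_t samples[AVG_WINDOW];
    uint32_t sum;
    uint8_t  pos;
} key_filter_t;

/* Push one raw ADC reading, return the current windowed average. */
static uint16_t key_filter_push(key_filter_t *f, uint16_t raw)
{
    f->sum -= f->samples[f->pos];
    f->samples[f->pos] = raw;
    f->sum += raw;
    f->pos = (uint8_t)((f->pos + 1) % AVG_WINDOW);
    return (uint16_t)(f->sum / AVG_WINDOW);
}

static bool key_pressed(key_filter_t *f, uint16_t raw, uint16_t threshold)
{
    return key_filter_push(f, raw) >= threshold;
}
```

At the measured 3029 scans/s, the worst-case added latency is the 16-cycle fill time mentioned in the post, ~5.3 ms.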
/me also thinks about using a long PCB trace on the controller, connected to the midpoint of a large-value resistor bridge (so that it's easily overwhelmed by external noise), with 2 comparators attached - one sensing a rise above 2.5V+??mV, and one a drop below 2.5V - and discarding data if either comparator triggered. This way even a very short spike will be detected. If the comparators' input capacitance doesn't kill the useful signal, that is. Directly sensing the Disturbance in the Force is somewhat more elegant, but then if it originates at the farthest point from the controller, we can just miss it.
So the IIR filter should be just fine (unless you're that HAM operator sitting directly under a 1200W shortwave transmitter, of course. I don't really know what to do in his case, frankly.)
-=-
Mu, the single threshold is a fundamental limitation of xwhatsit's controller architecture. It can be made per-column, but then you'll need to pause after each scan to let the DAC voltage stabilize - if you want to read out anything meaningful, that is. Which will bring the scan rate down so much you would frequently get "qp" when you type "pq" (because it would be easy to press "p" after the "p" row is scanned, and "q" after it but before the "q" row is scanned).
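Back-of-envelope arithmetic for that trade-off. The microsecond figures below are invented for illustration, not measurements from the thread:

```c
#include <stdint.h>

/* Illustration of why a per-column DAC settle pause kills the scan
 * rate: each column's conversion time grows by the settle time, and
 * the whole matrix pass slows proportionally. Numbers are assumed. */
static uint32_t scans_per_second(uint32_t columns,
                                 uint32_t convert_us,
                                 uint32_t settle_us)
{
    uint32_t pass_us = columns * (convert_us + settle_us);
    return 1000000u / pass_us;
}
```

With 16 columns at an assumed 20 us each that's ~3125 passes per second; add an assumed 100 us DAC settle per column and it drops to ~520 - plenty of time for the "pq"/"qp" swap described above.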
wcass, the problem is not memory - while it's true you can use GPIOs and eschew the shift registers, DAC+comparators is a fundamentally flawed approach. (Unless you have a DAC for every comparator - which is Luxury, I'm not aware of any SoC that has 8 DACs - but even then you somehow need to know the trigger level, and you can only guess at it by conducting a series of experiments. Why, when you can just measure things directly?)
Also look at my Kinetis rant. Maaaybe the K64 is better, but its datasheet is suspiciously close to the K20's in the ADC part. Also, don't forget you'll need those resistor matrices outside - they're cheap, but there are many of them.
As for memory - without making any effort to save it (I even copy all 2K of EEPROM to RAM on startup) I still use only about 9K of RAM and ~33k of flash space. It's no longer a problem.
I completely agree with you on the user interface though. I've already kinda written one (in Qt, same as xwhatsit's - so it should, in theory at least, be portable - but right now I can't even run the app outside of Qt Creator for some reason), and it has already saved me literally DAYS of debugging. (Incidentally, it doesn't do much for the actual user currently - but it has a "resize keyboard" feature!)
If I had an actual xwhatsit controller in June - I would've modified xwhatsit's utility. But since it plainly refused to run without a controller.. well.. I decided to learn Qt and copy the interesting pieces from it
wcass wrote: ↑But I think that the xwhatsit's easy user interface makes it more popular than the DPH controller. IMHO, the xwhatsit and Aikon have excellent user interfaces. I think that if the UI is not this easy, you really start losing people, because it is too hard to program.
Funny thing you say that, as I have been thinking about how to provide a good UI for configuration.
There are multiple layers to what a keyboard (controller) does:
sensing (e.g. raw keypresses)
translation (figuring out what to report to the computer based on what was sensed - macros, layers, etc.)
reporting to the computer (USB, BlueTooth, AT/PS/2 protocol, etc.)
Which of those would you want to configure? Translation seems quite obvious for sure - but would you want to fiddle with the other two? If yes, what would you like to adjust and why?
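For concreteness, here is a toy sketch of how those three layers might separate in code. Every name, the 2-layer keymap, and the report layout are invented for illustration:

```c
#include <stdint.h>

/* Toy separation of the three layers: sensing produces a raw key
 * index, translation maps it through a layered keymap to a HID usage
 * code, reporting packs it into a boot-protocol-style report. */
enum { NUM_KEYS = 4, NUM_LAYERS = 2 };

static const uint8_t keymap[NUM_LAYERS][NUM_KEYS] = {
    { 0x04, 0x05, 0x06, 0x07 },  /* base layer: a b c d */
    { 0x1E, 0x1F, 0x20, 0x21 },  /* fn layer:   1 2 3 4 */
};

/* translation: raw key index + active layer -> HID usage code */
static uint8_t translate(uint8_t raw_key, uint8_t layer)
{
    return keymap[layer][raw_key];
}

/* reporting: place one usage code into an 8-byte boot-style report
 * (byte 0 = modifiers, byte 1 = reserved, bytes 2..7 = keycodes) */
static void make_report(uint8_t usage, uint8_t out[8])
{
    for (int i = 0; i < 8; i++) out[i] = 0;
    out[2] = usage;
}
```

The point of the narrow interfaces is that each layer can be made configurable (or not) independently of the other two.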
Last edited by HuBandiT on 16 Dec 2016, 10:27, edited 1 time in total.
HuBandiT wrote: ↑
Funny thing you say that, as I have been thinking about how to provide a good UI for configuration.
There are multiple layers to what a keyboard (controller) does:
sensing (e.g. raw keypresses)
translation (figuring out what to report to the computer based on what was sensed - macros, layers, etc.)
reporting to the computer (USB, BlueTooth, AT/PS/2 protocol, etc.)
Which of those would you want to configure? Translation seems quite obvious for sure - but would you want to fiddle with the other two? If yes, what would you like to adjust and why?
You can't do much about reporting. It's defined by the interfaces you have. Well, if you theoretically have a BT _and_ USB keyboard - then you can select, but I have yet to see one.
translation is obvious (WITH LAYERS) - but there's KLL for that, and it would probably be better to use/create a KLL editor.
sensing - OH YES YOU WANT TO. Don't get me wrong - autoconfiguration is enough for 99% of the cases - but there must be a way to fine-tune thresholds, for example. Not applicable to normal switches - those are either open or closed. But capsense, hall effect sensors, those funny sound-delay-measuring sensors - they're all analog, so there are things to tune.
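A minimal sketch of what per-key tuning could look like - the feature the thread wants over a single global threshold. All names and numbers here are invented:

```c
#include <stdbool.h>
#include <stdint.h>

/* Per-key thresholds: each key compares its own reading against its
 * own calibrated trip point, so a key with a hot resting level (e.g.
 * a short trace with a big signal boost) doesn't need the same
 * threshold as everyone else. Matrix size and values are invented. */
#define ROWS 2
#define COLS 2

static uint16_t threshold[ROWS][COLS];

/* Autocalibration: sample each key's resting level and put the trip
 * point a margin above it. The margin is what a user would fine-tune. */
static void calibrate(uint16_t rest[ROWS][COLS], uint16_t margin)
{
    for (int r = 0; r < ROWS; r++)
        for (int c = 0; c < COLS; c++)
            threshold[r][c] = (uint16_t)(rest[r][c] + margin);
}

static bool is_pressed(int r, int c, uint16_t reading)
{
    return reading >= threshold[r][c];
}
```

With this shape, autoconfiguration covers the 99% case (run `calibrate` at startup) while still leaving individual entries of `threshold` exposed for manual fine-tuning.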
Well, there are a few things to tweak. E.g. do you want 8000 Hz or 1000 Hz or 125 Hz over USB (trading off latency vs. computer load and energy consumption), whether to do boot protocol or not, and any other potential quirks and non-standardness (e.g. the Linux-specific mapping Soarer had to do in his adapter).
DMA wrote: ↑Well, if you theoretically have a BT _and_ USB keyboard - then you can select, but I have yet to see one.
You soon might if Muirium and/or others chime in to that tune.
DMA wrote: ↑translation is obvious (WITH LAYERS) - but there's KLL for that, and it would probably be better to use/create a KLL editor.
I have only briefly looked into KLL; I have yet to grasp it. My approach would be to host an interpreted programming language in the keyboard, where the user can freely define and redefine behaviour to whatever they want, while the keyboard is running.
DMA wrote: ↑sensing - OH YES YOU WANT TO. Don't get me wrong - autoconfiguration is enough for 99% of the cases - but there must be a way to fine-tune thresholds, for example. Not applicable to normal switches - those are either open or closed. But capsense, hall effect sensors, those funny sound-delay-measuring sensors - they're all analog, so there are things to tune.
Well, I cannot argue with all the effort and trouble you went through here. However, I would like to try for myself - maybe I'll get lucky.
BTW even for normal switches, you might want to configure thresholds - but only for fancy stuff like key press velocity estimation for regular mechanical switches.
A keyboard was ordered, development tools were ordered, hopefully I can soon get into motion after all this armchair philosophy.
Nope, the smallest USB polling interval is 1ms. And no, the device cannot initiate a transaction.
HuBandiT wrote: ↑(e.g. the Linux-specific mapping Soarer had to do in his adapter).
"Linux mappings" arose from the fact that someone was too greedy and thought a keyboard can realistically have more than 62 keys pressed (not including modifier keys). Who this guy is, I do not know.
This led to the bitmap approach for NKRO mode. If you just extend the standard boot protocol message to 64 bytes (USB can't allow more!) - you get buffer space for 62 keys, AND you don't have to implement a separate protocol - the BIOS will simply ignore everything beyond the first 8 bytes. (For real boot mode that will mean a slight protocol violation, since you won't be setting eKRO when more than 6 keys are simultaneously pressed - but screw those guys, they had it coming. I'm not aware of _any_ case where you need more than 1 key pressed at a time in the BIOS (well, if you type too fast you can probably get 2).)
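The extended-report layout described above could look roughly like this - a sketch, not DMA's actual firmware; the slot-filling logic is invented:

```c
#include <stdint.h>

/* "Extended boot protocol": the same layout as the 8-byte boot
 * report (modifiers, reserved, keycode array), just grown to the
 * 64-byte full-speed interrupt-endpoint maximum, giving 62 keycode
 * slots. A BIOS speaking boot protocol simply ignores everything
 * past byte 7. */
#define REPORT_SIZE 64
#define MAX_KEYS    (REPORT_SIZE - 2)   /* 62 keycode slots */

typedef struct {
    uint8_t mods;           /* modifier bitmask, as in boot protocol */
    uint8_t reserved;
    uint8_t keys[MAX_KEYS]; /* HID usage codes, 0 = empty slot */
} ext_report_t;

/* Put a usage code into the first free slot. Returns 0 on success,
 * -1 in the (improbable) case that all 62 slots are taken. */
static int report_add_key(ext_report_t *rep, uint8_t usage)
{
    for (int i = 0; i < MAX_KEYS; i++) {
        if (rep->keys[i] == usage) return 0;  /* already reported */
        if (rep->keys[i] == 0) { rep->keys[i] = usage; return 0; }
    }
    return -1;
}
```

The first 8 bytes of this struct are byte-for-byte a standard boot keyboard report, which is the whole trick.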
HuBandiT wrote: ↑You soon might if Muirium and/or others chime in to that tune.
and the battery will have to occupy all the free space in the Model M case and discharge in 3.. 2.. 1..
HuBandiT wrote: ↑to host an interpreted programming language in the keyboard, where the user can freely define and redefine behaviour to whatever they want, during the running of the keyboard.
Well, there's the Raspberry Pi for that. It would actually be an interesting project to find out whether an RPi can be made bus-powered and then have a normal OS and tools in a keyboard. Keyboards with their own power supplies have gone out of fashion.
HuBandiT wrote: ↑BTW even for normal switches, you might want to configure thresholds - but only for fancy stuff like key press velocity estimation for regular mechanical switches.
If your mechanical switches have more than 2 states - they're seriously broken, replace them.
HuBandiT wrote: ↑A keyboard was ordered, development tools were ordered, hopefully I can soon get into motion after all this armchair philosophy.
Good luck! You'll find LOTS of stuff which gives you headaches in lots of interesting ways.
Get a good scope - the most important thing is to have 4 channels (2 channels with a separate trigger channel will do, but it costs about the same). You don't need fancy triggering - but you will find holdoff mode useful. I find the DS-1054 quite fun to use (if only Rigol made the handles differently sized, so you don't grab vertical position instead of the multifunction control!)
- but it seems to have just a single PCB, with both the capacitive pads and the sensing circuitry on it. :S
[attachment: P1040489_.JPG]
Looks like I'll have to either hack it up (cut away the PCB part with the sensing circuitry - or at least cut the traces and graft my own connections on), or perhaps in fact procure a different model with circuitry not integrated onto the PCB if I want to develop a plug-on controller.
Any suggestions?
Last edited by HuBandiT on 04 Jan 2017, 16:32, edited 1 time in total.
I thought that was the point of the exercise--to develop a better alternative to the xwhatsit controller, which is a direct replacement for the Model F and beamspring capacitive sensors/controllers? If you're interested in developing a downstream option, isn't that more like the Soarer controller that remaps the keystrokes?
Sorry if I'm being naive-- this is obviously not my forte.
Techno Trousers wrote: ↑I thought that was the point of the exercise--to develop a better alternative to the xwhatsit controller, which is a direct replacement for the Model F and beamspring capacitive sensors/controllers? If you're interested in developing a downstream option, isn't that more like the Soarer controller that remaps the keystrokes?
Sorry if I'm being naive-- this is obviously not my forte.
The Model F and Beamspring keyboards have two circuit boards: one board underneath the keys forming a passive capacitive key matrix, and a second board containing the signal triggering and sensing circuitry and the controller (scanning and reporting). The two boards are connected through a board connector. Hence, when modernizing a Model F or a Beamspring keyboard, the most practical (and reversible) method is to disconnect and replace this entire second board; however, this forces the designer (xwhatsit and DMA) to have their own signal triggering and sensing circuitry on the replacement board together with the digital controller containing the key scanning and key reporting logic.
The community feedback was that there is demand for improving these signal triggering and sensing solutions. My intent therefore is to have an attempt at developing an improved signal triggering and sensing solution.
The particular keyboard specimen I acquired for this development, however, seems to include signal triggering and sensing circuitry that is not removable (without damaging the keyboard), because it is integrated onto the same circuit board as the passive capacitive key matrix. This prevents me from developing my own version of the signal triggering and sensing circuitry without irreversibly modifying (instead of just reversibly replacing/adding to) the keyboard.
Because the keyboard seems to be in relatively good condition mechanically and electrically (at least on visual inspection), it would be more valuable to preserve its function and use it as candidate for restoring to original condition (should anyone have an old system such that they can connect it to it), or for developing a modern controller/interface for it that relies on (instead of replaces) its integrated signal triggering and sensing circuitry. (That however would be a digital interfacing project, which is not the project I am pursuing here - although it might be if there is enough demand.)
Therefore, I am quite hesitant to irreversibly damage/modify this keyboard for my development purposes. Furthermore, even if I did decide to modify it, it might turn out to differ enough in its characteristics from the original keyboards I intend to provide a signal triggering and sensing solution for that my development effort - with the eventual goal of providing an alternative to xwhatsit's or DMA's solution - would be moot: results obtained on this development specimen might simply not apply to the original keyboards.
Each EDA tool has its place within the Tao.
But do not edit PCBs in CircuitMaker if you can avoid it.
It's practically impossible to abandon the project which possesses you
It came to the PCB stage. A debug PCB, because 30 GPIOs is not enough if you want differential channels (and you want them - good chance they will improve external noise immunity). You need 48 for that. Well, for beam springs you need 56 - I don't have any of those lying around, but why not lay out for it; maybe I'll just find one lying in the gutter or something. The controller is supposed to be universal, after all. Just a tiny strip of FR4 to convert to 0.12/0.2" connector pitch is all that should be needed for the more esoteric keyboards.
So, CircuitMaker. My favorite bit so far: it proclaims random PCB regions cursed, and if you try to lay a track there, it shows an hourglass for like 5 minutes. For every click.
And it stores all your files "in the cloud". And insults you by assigning the title of "Maker" to your account - which you can't hide or change. Should've chosen KiCad, despite the horrible ugliness of the fonts. It will do for the release PCB. Just need to fix that KiCad stroke font - my eyes bleed every time I see it
So. At first it looked like this:
[attachment: pcb_preproto.png]
Pretty. Makes autorouter so miserable it shoots itself in the head. And manual routing.. No thanks.
Now it looks like this:
[attachment: pcb_proto.png]
Autorouter failed even on the connector part. But it was somewhat easier to manually lay out.
Tomorrow I'll strip the text from the silkscreen (because it's a mess, at least around the connector) and figure out how to do those golden insignias on the PCB (if you sign your work - do it in style. Thinking of registering "Open Drain Hardware, LLC" even.). Then it goes to OSHPark, and the BOM goes somewhere else. Then !!SCIENCE!! (and two of those, I hope, will find a place in my Model MFs. Can't wait!)
Those pads beside the connector are for resistors in series with the line. Or for diodes between the line and ground, to protect the controller from the 1200W transceiver guy. It's a debug PCB, and I'm doing my best not to have to order another debug version. Boy, one-off things are expensive.
If the diodes make sense - I'll probably need to go four layers, because it's a bitch to lay out 50+ GPIOs in limited space, let alone route every one of them through 3 discrete components, 2 of which are grounded.
But that's all somewhere in March, maybe even April. Going to Moscow for most of February - to hunt down some very rare specimens for HaaTa, among other things. Then a new job, I hope - which also takes time.
tentator wrote: ↑Hey DMA what are you up to here? I completely lost you so far!
What CPU is it then?
Trying to figure out whether extra components will lead to any gain, and whether that gain justifies the extra cost. With the standard Cypress kit you don't want to run the keyboard near dimmable LEDs, especially when you're turning them on or off.
Hardware is still Cypress. Kinetis ADCs proved to be UTTER crap, so I'm not going there anymore.
Device 'PSoC 5LP CY8C5267AXI-LP051' was successfully programmed at 02/02/2017 22:28:56.
[attachment: horror.jpg]
This thing is surprisingly picky about the quality of its juice. 7 capacitors is the bare minimum it wants before it agrees to be programmed.
Boy I hate myself for choosing 0603 over 0805 SMD parts.
Butbutbutbut! I have USB bootloader flashed. Just need to solder USB on.
DMA wrote: ↑
..and I already soldered one chip 180 degrees from where it should be. Too nervous
Don't worry, some flux and a hot-air gun will do the job. I've done it all in the past (soldered a QFP the wrong way, more than once -- after all, your chance of getting it right is one in four unless you think beforehand).
[attachment: 20170202_235658.jpg]
We call this "rabid prototyping". Like all good things, it comes at a cost. HUGE cost.
[attachment: 20170202_235538.jpg]
And this is what happens when you order 200 pins, but zero sockets. But you have a breadboard. And a basic Arduino kit you bought when that RadioShack across the street liquidated their store.
[attachment: IMG_20170202_235928.jpg]
It actually works - USB, bootloader, firmware. Host utility talks to it. It even senses something if you poke those gold contacts with your fingers.
Tomorrow it would finally be time for !!SCIENCE!!.
That row of square pads is supposed to have resistors soldered on. I have 100R, 1K and 10K ones - we'll see how they affect sensing, especially sensing of noise.
I'll also try to solder an antiparallel diode pair clipper between the sense lines and ground, to see how that helps with spikes.
99% of all this will not go into production, because a controller with it would cost twice as much and have a _HUMONGOUS_ PCB, and it would only help this guy (and prevent potentially-controller-killing static buildup on the sense lines - though I hope the on-chip protection diodes cover that).
Also need to look at current consumption - if it's low enough, maybe it would be worth slapping a battery and a BT module on.
So. Any volunteers to loan me a beam spring keyboard for a couple of weeks to a month (ideally Feb ~25 to Mar ~7 - I won't have much time later) in exchange for the group buy not being Model-F-controller-only? Don't worry, you'll get it back. With an s/n #0, autographed (if you so desire) beam spring controller even - for your type of beam spring, that is.
The worse it works with existing controllers the better.
I'm in Seattle. I can pick it up and drop it off within a reasonable radius (~up to the Bay Area). Otherwise you pay shipping and insure it for whatever you feel like. I'm starting to worry a bit about the cost of this whole endeavor, and shipping a hunk of metal that large ought to be expensive.
If there are no volunteers - well, there will be no modern beam spring controller
The hardware will be open-sourced as soon as I redraw it in KiCad (making it less hairy than it is now) - because friends don't let friends use Altium CircuitMaker.
hbar wrote: ↑I wish I were that far.
You probably have a life.
You probably don't want to give _that_ up for faster progress.
DMA wrote: ↑I'm in Seattle. I can pick it up and drop it off within a reasonable radius (~up to the Bay Area). Otherwise you pay shipping and insure it for whatever you feel like. I'm starting to worry a bit about the cost of this whole endeavor, and shipping a hunk of metal that large ought to be expensive.
If there are no volunteers - well, there will be no modern beam spring controller
If I were somewhat closer to you I would volunteer. I hope someone on the west coast will be able to help. Great work, BTW!
[attachment: 3 feet of horror.jpg]
At the end of this vintage 2ft SCSI-2 cable made by "HUNG FU ELECTRONICS CO., LTD" lies the CommonSense controller rev. 1e-3.
It's working.
With 1 foot of cable-to-keyboard wires and an LED dimmer wire lying under it, it's impressive that the output is pretty stable.
It looks right now like differential signalling is not that much different from single-ended. The resting state is a bit below zero for most of the keys. The level distribution is practically the same (only negated, so the top-left has a hole now instead of the hill) - except for a couple of keys that look like they're stuck in the pressed state - only when you press them they go even higher.
The PCB was laid out by some idiot who arranged the 8-pin parts of the connector in 4-6-2-1-3-5-7 order (i.e. yours truly). Took the better half of the day to figure out a pin mapping which is not a rat's nest and yet allows columns 0-8 to go to ADC0 and 9-17 to ADC1 (otherwise there weren't enough analog buses - the analog part of the chip has left and right halves internally, so there's AMUXBUSL and AMUXBUSR. They can be connected together if needed, but then only one ADC can access the bus. This is baked in at design time, so one needs to be careful).
Anyway, 3 resolderings and a countless number of mouse clicks later (because one can't edit schematics as text files - it's GUI-only)..
Tomorrow is scoping time! For some reason I still feel excited every time I boot up the scope.
Diff sensing offers no significant advantage over SE in impulse noise immunity. Diff sensing with the _N lines grounded at the keyboard side is indistinguishable from SE.
If you try to read over 1m of ribbon cable you will only read zeroes - because the negative conductor equalizes with the positive one (there will be something on both - it's just that *_P and *_N will have the same potential). SE works just fine - the SNR is a bit worse because of cable capacitance, but otherwise..
Diff sensing presents its own set of problems - rows 9 and 10 got a very low differential signal on my set, despite practically everything being equal with rows 8 and 11, for example. SE is juuuust fiiine - and the difference is only firmware, specifically the AMUX and ADC input configuration.
You can actually ground only the chassis and leave the PCB earths unconnected - it doesn't make any practical difference. Well, the ground state will fluctuate a tiiiiiiny bit more and there will be a small amount of charge leaking to adjacent keys (negligible, I'd say), but otherwise everything stays the same.
Removing all those ground conductors would make routing easier. Unfortunately, a single SSK-sized PCB is EXPENSIVE to produce, so its feasibility will forever remain a mystery. However, I can scrape the existing F122 board as an experiment.
Because of the above I suspect that adding foam under the PCB will make the SNR worse - I'll try to verify that when my MF arrives, since suspending the board in foam is supposed to help keep the noise down.
The IIR 7/8 filter works pretty well. 3/4 also works fine unless you touch the cable with the dimmer wire (7/8 shows some resilience even in those pretty harsh circumstances - it starts to depend on the spot on the ribbon).
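For reference, an integer sketch of what an "IIR 7/8" smoother typically means - my reading of the shorthand, not DMA's actual code: each step keeps 7/8 of the old state and takes 1/8 of the new sample, so a one-sample spike moves the state by only spike/8.

```c
#include <stdint.h>

/* One step of a first-order IIR low-pass with alpha = 1/8
 * ("7/8 filter"): state' = (7*state + sample) / 8.
 * The "3/4" variant would be (3*state + sample) / 4. */
static uint16_t iir_78(uint16_t state, uint16_t sample)
{
    return (uint16_t)((7u * state + sample) / 8u);
}
```

An 800-count spike on a resting level of 100 moves the state to only 200, while a sustained change walks the state to its new level within a few dozen scans.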
So about the only way to improve immunity with a long cable is to run a special sense line somewhere in the sense cable, scan it (synchronously!) with a separate ADC, and discard the result if you read _anything_ on that special line.
Because a 0.5us spike is all it takes, and it vanishes immediately - the previous and next lines are completely unaffected.
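The discard logic for such a guard line is trivial - a sketch with an invented noise limit:

```c
#include <stdbool.h>
#include <stdint.h>

/* Guard-line discard: an extra conductor in the sense cable is read
 * in the same ADC pass as the real columns. It should read ~nothing;
 * if it picked anything up, the whole pass is assumed to be
 * spike-contaminated and the previous value is kept instead. */
#define GUARD_NOISE_LIMIT 8   /* ADC counts still considered "nothing" */

static bool scan_valid(uint16_t guard_reading)
{
    return guard_reading <= GUARD_NOISE_LIMIT;
}

/* Accept a new column reading only if the synchronous guard read was
 * clean; otherwise hold the previous value for this key. */
static uint16_t accept(uint16_t prev, uint16_t reading, uint16_t guard)
{
    return scan_valid(guard) ? reading : prev;
}
```

Holding the previous value (rather than zeroing) means a discarded pass costs one scan cycle of latency and nothing else.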
It's better to have separate drive and sense cables for purity reasons, but you can even interleave them - it will just affect the ground-state levels. For example, if you route drive 5 with sense 3 - you will have an elevated (5, 3) cell level. For both states. The activation delta will be about the same, and both levels are FAR from 2V, so nothing will saturate.
Might as well start designing production PCB. Will wait for my MFs to arrive though so that prototypes are easier to test-fit. Also it looks like I'll have a 3277 and may be another beam spring specimen loaned to me, so there will be a beam spring version. If that loan pans out.
Let me say that DMA is rocking this out. I started and got stuck. He's owned it.
I'm now tracking and replicating. I can confirm that I'm able to reliably scan all of the keys that I've hooked up to the devkit. Now, the F122 isn't my target for this project... this is:
After seeing how reliable and consistent this thing is under abysmal conditions, I'm not scared in the slightest of designing my own base PCB for this puppy...
DMA wrote: ↑Might as well start designing production PCB. Will wait for my MFs to arrive though so that prototypes are easier to test-fit. Also it looks like I'll have a 3277 and may be another beam spring specimen loaned to me, so there will be a beam spring version. If that loan pans out.
Packaged up... will go out on Monday when I return home. Take care of my children
There are 2 F122 PCBs in there as well. Feel free to consider them disposable if the need arises.
Great news all around!
People are voluntarily using my software, and toys are arriving (I won't be able to pick them up, but there are people left at home, so they won't just sit on the porch).
Now let's try to put that 13 hour layover in NY to good use.
Last edited by DMA on 08 Feb 2017, 12:45, edited 1 time in total.