Perhaps one of the most competitive markets for truly micro electronics is hearing aids. This general market has grown enormously with ear-buds. I admit to being personally interested because my hearing is the most notable casualty of my now over 3%4 (I like the percent sign for division as it originally evolved) century. This seems a natural market for Forth. Is it used by any of the many manufacturers? What processors do they use?

I am also particularly interested because I see it as a natural use for CoSy's unique APL vocabulary in open Forth. While I was in visual psychophysics at Northwestern (overlapping Dave Jaffe's time there in biomedical engineering), and that is what impelled me to learn APL, the same sort of high-dimensional math applies to all sensory modalities. In audio space I would like to create a Volterra-style map of amplitude x frequency space which can handle intermodulation distortion as well as harmonic, adjustable by the user thru a smart phone app. Again, intelligent ear buds seem a natural market for Forth's minimalism. Is it a player? Bob A | --
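For concreteness: in its simplest linear form, the amplitude x frequency map described above can be pictured as a gain surface indexed by input level and frequency band, which is roughly what a multiband hearing-aid fitting does. Below is a minimal Python sketch of that linear piece only; every band edge and gain value is invented purely for illustration, and the full Volterra idea would add nonlinear kernels to shape harmonic and intermodulation products, which this sketch does not attempt.

import numpy as np

# Hypothetical amplitude x frequency gain surface: rows are input level,
# columns are frequency bands. All numbers here are made up for illustration.
level_edges_db = np.array([-80, -60, -40, -20, 0])   # input-level row boundaries (dBFS)
band_edges_hz  = np.array([0, 500, 2000, 8000])      # three frequency bands
gain_db = np.array([[20, 25, 30],                    # very quiet input: strong boost
                    [15, 20, 25],
                    [ 5, 10, 15],
                    [ 0,  5, 10],
                    [ 0,  0,  0]])                   # loud input: no boost

def map_frame(frame, fs):
    """Apply the (level, frequency) gain surface to one frame of audio."""
    spec  = np.fft.rfft(frame * np.hanning(len(frame)))
    freqs = np.fft.rfftfreq(len(frame), 1.0 / fs)
    level = 20 * np.log10(np.sqrt(np.mean(frame ** 2)) + 1e-12)   # frame RMS in dBFS
    row = np.clip(np.searchsorted(level_edges_db, level) - 1, 0, gain_db.shape[0] - 1)
    col = np.clip(np.searchsorted(band_edges_hz, freqs) - 1, 0, gain_db.shape[1] - 1)
    spec *= 10 ** (gain_db[row, col] / 20.0)          # per-bin gain looked up from the surface
    return np.fft.irfft(spec, len(frame))

# Example: a quiet 1 kHz tone picks up the mid-band, quiet-row gain of the table.
fs  = 16000
t   = np.arange(512) / fs
out = map_frame(0.01 * np.sin(2 * np.pi * 1000 * t), fs)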
As it turns out, a hearing aid company was one of IntellaSys' customers. I don't think it ever resulted in any actual product being sold. The key was the low power; the equivalent conventional DSP chip solution caused the lights to dim when the code was executed. | --
The only mention I could quickly conjure with Google was from the end of IntellaSys:
As far as I know only one person (Michael) on the TPL payroll is
Do you have a web link for TPL?
Thanks! | --
I'm sorry Google is broken at your end. 😎🗿 | --
DSP and Forth are two of my interests, so I remember a hearing aid product was discussed in the context of the GA144. What I recall is that the hearing aid company/researchers had a rather unique algorithm, much more complex than the standard equalizer-type algorithm, that was implemented on a pair of TMS320C6200-type processors. These devices are intended for the highest performance on 32-bit data for use in cellular base stations where power is not a big issue. So they used a lot of power.

A similar algorithm was implemented on the GA144 at a much lower power level. The GA144 can use up to about 1 W with all processors maxed out, so potentially not something that would run from a hearing aid battery. The devil is in the details. The question is whether this algorithm could ever be tamed enough to get the power level down to a number that would run on the 160 mAh cell typically used. I think my friend's hearing aids run about two days of continuous use. That comes to about 3 mA, or about 4.2 mW. One F18A processor running flat out uses 4.5 mW. So to be effective as a hearing aid, the sum total of the 144 processors in the GA144 would need to be running at a duty cycle equivalent to a single F18A. That gives the putative 700 MIPS, which is a maximum for certain core instructions. Still, that's a lot.

How does it compare to the TMS implementation? One TMS320C6200 family member can run at up to 8000 MIPS, with a more typical number from the time in question being 2000 MIPS. So two of them would make 4000 MIPS available. Could 700 MIPS, scattered across 144 processors, manage to do what required ~4000 MIPS on the TMS320C6200s? I don't know. It would be a mean trick of engineering, I expect. So it may explain why the product never came to market.

As a comparison, I used to track the unique processors that are often used for hearing aids. These processors are not general-purpose machines; rather, they are uniquely tailored to processing sequential data with the same algorithm on every sample, much as would be done in a FIR filter... or many FIR filters. It has been a while since I've noted the speeds, so I don't recall how they compare, but I'm sure they are nowhere near 700 MIPS. A quick search turns up some parts that have an ARM with three specialized coprocessors providing a claimed 375 MIPS at around 1.2 mW. Notice this is roughly twice as power efficient as the F18A.

Rick Collins | --
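A quick back-of-the-envelope check of that arithmetic, in Python. The 160 mAh cell, two days of use, 4.5 mW and 700 MIPS per F18A, and ~4000 MIPS TMS figure are the numbers quoted above; the 1.3 V cell voltage is an assumption (zinc-air hearing aid cells run roughly 1.25 to 1.45 V under load).

# Sanity check of the power-budget numbers in the post above.
cell_mah = 160.0                   # typical hearing aid cell capacity (from the post)
hours    = 48.0                    # about two days of continuous use (from the post)
cell_v   = 1.3                     # assumed zinc-air operating voltage

avg_ma = cell_mah / hours          # ~3.3 mA average drain
avg_mw = avg_ma * cell_v           # ~4.3 mW total power budget

f18a_mw   = 4.5                    # one F18A running flat out (from the post)
f18a_mips = 700.0                  # peak MIPS for certain core instructions

duty        = avg_mw / f18a_mw     # fraction of one F18A the budget buys
budget_mips = duty * f18a_mips     # ~670 MIPS spread across all 144 cores

print(f"average drain: {avg_ma:.2f} mA")
print(f"power budget : {avg_mw:.2f} mW")
print(f"duty cycle   : {duty:.0%} of a single F18A")
print(f"MIPS budget  : {budget_mips:.0f} vs ~4000 MIPS on the pair of TMS320C6200s")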
Rick, thank you for your expertise. What sort of chips are available in form factors which fit in ear buds? I would guess max ~16 mm^2. I always see chips > 1 cm^2 and boards ~6 cm^2 or greater in the Forth discussions I see. One would hope the actual acoustic energy would far dominate the electronics.

Your mention of FIRs required me to look it up: https://en.wikipedia.org/wiki/Finite_impulse_response which triggered me to click https://en.wikipedia.org/wiki/Kronecker_delta with a side trip to https://en.wikipedia.org/wiki/Dirac_delta_function and another to https://en.wikipedia.org/wiki/Iverson_bracket which is new to me. A number of things from Iverson's 1962 book were winnowed before the vision ever managed to be implemented; brackets became indexing. Which overall brings me to the reality that the output is two temporal flows. As I described, I think of the problem in terms of functions on, distortions of, the frequency x amplitude surface: expanding one region, perhaps contracting another. But then, bottom line, that needs to be transformed back to a temporal filter function. That's a bunch of math I'd only learn by implementing in CoSy, improving CoSy in the process. I'd see doing the tuning on one's smart phone, with what is essentially a table lookup uploaded to the buds. I see the definition of the CoSy APL vocabulary in scalar sequential Forth sort of like a Fourier transform between the Dirac & Kronecker delta implementation and the conceptual holistic object transformation domain.

Thanks again for the response. You guys know so much of the infinity that I don't. Bob A | -- Peace thru Freedom
Honesty enforced thru Transparency ,
|
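The "transformed back to a temporal filter function" step mentioned above has a standard workhorse: sample the desired magnitude response on a frequency grid, inverse-FFT it, and window the result into a short FIR whose taps are the table-lookup-sized object one would actually upload to the buds. A minimal frequency-sampling sketch in Python; the target response, sample rate, and tap count are arbitrary illustration values, not anything from the thread.

import numpy as np

# Turn a desired magnitude response (the frequency x amplitude description)
# back into a short FIR filter (the temporal flow) by frequency sampling.
fs     = 16000
n_taps = 64
freqs  = np.fft.rfftfreq(n_taps, 1.0 / fs)

# Made-up target shaping: flat below 500 Hz, +12 dB to 4 kHz, +6 dB above.
desired_db = np.where(freqs < 500, 0, np.where(freqs < 4000, 12, 6))
desired    = 10 ** (desired_db / 20.0)

# Zero-phase inverse FFT, then rotate to a causal center tap and window
# to tame the ripple from sampling the response so coarsely.
h = np.fft.irfft(desired, n_taps)
h = np.roll(h, n_taps // 2) * np.hamming(n_taps)

# Each output sample is then just a dot product with the last n_taps inputs,
# which is the kind of regular per-sample work hearing-aid DSPs are built for.
def fir(x, taps=h):
    return np.convolve(x, taps, mode="same")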