Monday, September 19, 2011

MakerFaire NYC 2011

I went to MakerFaire at the NY Hall of Science, Queens, NY on September 17th and 18th, and took plenty of photos. Like last year, the location was the site of the 1964 World's Fair. Even though I grew up pretty close to New York, I didn't get to see the World's Fair as a child, so I'm glad to have these opportunities to see what's left of it.
As was true last year, there were lots of 3D printers and CNC milling machines. My impression this year was that a much larger percentage of them were hobbyist efforts rather than high-end commercial projects. I think that's a good thing. There seems to me to be a maturing of the 3D printer hobbyist effort in general, and the gradual emergence of more small businesses like Bre Pettis's Makerbot. The hard technical challenges (the big one being getting the extruder nozzle to work just right) have pretty much been identified now.

As I looked at some of the products, which have been improving in resolution, it occurred to me that an interesting approach would be, instead of going with increasingly fine nozzles, to use a coarse nozzle to place a slightly oversized drop of plastic, let that drop cool and harden, and then bring in a milling tool to shape it. This would mean moving back and forth frequently between the extruder nozzle and the milling tool, so it would need some tinkering and might not end up being an improvement.

There were lots of other tools, things on display, and cool stuff to see. I was really impressed with an elegant (if low-res) volumetric display called Lumarca. Essentially the guy uses a projector to project colors onto lengths of monofilament fishing line in the viewing volume, and by carefully controlling the projected image, he individually controls what colors appear along each length of monofilament.

There were a few interesting vehicles, like motorized skateboards and a Segway clone. Those were fun.

One thing I found interesting was that in addition to the expected Arduino stuff (which has the full weight of O'Reilly Publishing behind it), there were a good number of boards with more advanced microcontrollers, particularly ARM Cortex-M3 controllers. This interests me because with their larger address spaces and fuller feature sets, ARM processors can run Linux OSes or Python interpreters or other big pieces of code, beyond the itty-bitty programs that will fit on an Arduino. Teho Labs had a nice line of Cortex-M3 boards. They shared space with Dangerous Prototypes, who were showing off their Web Platform board.

I did get pretty tired and sore walking around so much, and needed some Advil. But it was definitely worthwhile. Maybe I'll have some project next year so that I can have a booth of my own.

Sunday, September 11, 2011

We need to build more educational computers

When I was a kid, I had this absolutely wonderful educational computer called Digi-Comp. It was very simple, with only three bits of state, and that was fine for learning an awful lot of basic stuff about computers. And it was sturdy as anything. I must have disassembled and reassembled it hundreds of times and it never broke and never stopped working. Somebody needs to design a 21st century mechanical computer using an inexpensive service like Ponoko or Shapeways or 100kGarages or Big Blue Saw to do laser cutting or 3D printing.

When I was in college, I saved up money to buy a KIM-1 single-board computer with a 6502 microprocessor, a hexadecimal keypad, and six seven-segment digit displays. I learned a lot of what became my career playing with that thing, writing little assembly language programs, soldering TTL chips to it, and generally having a great time. When I left school, I bought an old ASR-33 teletype from the school's department for retiring obsolete junk, and used it to give the KIM-1 a 110 baud line printer. Back in those days we had extraordinarily low thresholds of entertainment. Still, it was educational.

We need to be building more of this kind of stuff today. The things we build need to be easily hackable and easy to form user communities around. I guess you could say we already have something like the KIM-1 with today's Arduino, but it never fills me with intrigue like the KIM-1 did, where you were literally typing machine opcodes on that hex keypad and watching them show up in the LED digits. That gives you a real sense of intimacy with the entire process of computation. Compiling C code never gets you quite that close to the action.

If you regard the Digi-Comp and the KIM-1 as two points along a spectrum of sophistication, we probably ought to plan on a third more advanced point, given that the KIM-1 is about 30 years old now. I've been doing some puttering with ARM-7 and ARM-9 boards of various kinds, some so capable as to run Linux, and I think that's a good third point because that's the kind of hardware that appears in modern consumer devices.

Friday, August 26, 2011

Random notes for 26 August 2011

I don't have anything individually notable happening lately, but I thought I'd talk about a few different things. Yesterday at lunchtime I talked to some folks at Harvard Medical School who are cooking up a great little open source project, and I'm hoping to contribute to it. Their idea is to use Blender to make it simple, easy, and quick for medical researchers to put together animations involving multiple proteins interacting. So far, making an animation has been a colossal hassle (I know this from experience), and usually the researcher is way too busy doing science, so any animation gets done by public relations folks as a means to communicate the research to the non-scientific public.

If animation were so quick and easy and painless that the researcher could do it himself or herself, then researchers could share animations with one another, modify another researcher's animation, append commentary or publication references or other metadata (I am envisioning something like a git repository with forks), and animations would become more than just a PR tool. They would become a part of the active scientific literature, and they would make it possible for researchers to dig deeper into problems, to have more detailed and nuanced communications with one another, and ultimately for better science to be done.

They've decided to attack the problem of simulating and animating several proteins (or other similarly sized structures like cell membranes) interacting simultaneously. Most software tools in this area are designed to deal with only one large molecule at a time. But disease processes often involve interactions. Think about viral self-assembly, where a bunch of pieces come together to form a protein shell around some RNA. Think about ligands binding with receptors. These involve two or more large molecules, and you need to keep track of their positions, their orientations, their mechanical properties, their electrostatic properties, their relative linear and angular momenta. It's a yummy area of inquiry, and done correctly, this could be a significant advance for science, in an area that might easily be dismissed as "pretty pictures".

So yeah, I'm pretty psyched about their project. But there is other news. I'm saddened to learn that Steve Jobs is stepping down as Apple's CEO, presumably for health reasons. It's a sad sad thing that we don't have a better handle on cancer, AIDS and other big diseases. As an engineer in a society that considers itself advanced, I find it a little embarrassing that we've done so poorly. I'm also embarrassed that our economy distorts the motivations for dealing with these -- pharmaceutical companies make bigger profits "treating" diseases than curing them. And don't get me started with medical insurance, or the FDA approval process.

On to happier topics. At my job we are starting to use Flask for some things, which is a sort of simplified Django. Neat stuff. And we're tinkering with Mongo and Redis, both of them fascinating NoSQL databases. Very very cool. I need to think of some interesting home projects for these.

Sunday, July 31, 2011

Molecule construction and visualization website

For a couple of months now, I've been at work on a website for constructing and visualizing molecules. In-browser molecular dynamics are done with a molecular mechanics modeler based on Norman Allinger's MM2 as described in Eric Drexler's Nanosystems. This is the same set of mathematics I used in an earlier effort in the same vein called NanoCAD in 1997. Unfortunately my knowledge of chemistry and molecular modeling hasn't grown very quickly in that time. I know a bit more from my time with Nanorex, where molecular modeling was mixed with gadgets to supply external forces ("jigs" in the parlance of our program), an idea that I believe is crucial to nanotechnology design software and also to scaling molecular simulations to much larger scales. I hope to use the code from this website as a starting point for working in that area.

I have expenses like everybody else, and I'm trying to think of ways to use this website to make a little money without tarnishing its educational potential or scientific credibility. I want the website to be readily available for use in schools and universities. If I end up putting ads on the website, I hope to make them tastefully small and out-of-the-way. I've noticed that the HTML5 canvas I'm relying upon for graphics doesn't work on iPhones, iPads, or Android phones, so there's an opportunity to sell mobile apps for those platforms, using their native graphics canvases. I'm very open to ideas to bring in a little revenue without being tacky. I've put a lot of work into this, and plan a lot more.

Longer term plans include adding jigs as discussed above, maybe an interface for a force-feedback joystick so that you can find out what Brownian motion feels like, using the website to get access to much better and faster simulators, and storing your own private library of molecules. This stuff started out as Java code with the website's JavaScript being co-developed, but at some point the JavaScript development took off and I didn't make time to keep the Java up to date. So I need to do that, and then the JAR file can become a useful computational chemistry tool.

Tuesday, July 26, 2011

Molecular dynamics and force fields

In the web's early days, I wrote a Java applet to do a little bit of molecular modeling in your web browser. I had picked up a copy of Eric Drexler's book Nanosystems, read the section on MM2, and understood it well enough to implement it in code. That was a lot of fun to work on, and as I've watched web technology progress, I've occasionally thought about taking another stab at it. I am now in the process of doing that, as some parts have gotten easier, others have gotten faster, and some have gotten just plain interesting. JavaScript, despite a few warts, is charming, and widespread deployment of HTML5 is a big help too.

Molecular modeling approaches like MM2 generally work by computing potential energy as a function of the relative positions of atoms within a molecule. This is done by decomposing the potential energy into a series of terms, relating to the lengths of chemical bonds, the angles between bonds that share a single atom, or the dihedral angle between two bonds linked by a third bond. These terms are parameterized based on the elements and hybridizations of the atoms involved. Force contributions of these terms can be computed independently and summed together to find the forces acting on each of the atoms. This is necessarily a simplification, and more accurate approaches exist involving solutions to the Schrodinger wave equation describing probabilistic locations of electrons and nuclei. But the simple mechanical approach is adequate for getting a sense of the molecule's general shape and how it perturbs over familiar temperature ranges.

For a potential energy function E(p) relating to some geometric parameter p, we can use the chain rule to get the force on an atom at position (x,y,z):
(fx, fy, fz) = -E'(p) (∂p/∂x, ∂p/∂y, ∂p/∂z)
where ∂p/∂x is the notation for the partial derivative of p in the x direction. The forms of the potential energy functions are pretty straightforward (1, 2), and taking their derivatives is not difficult. The remaining trick is to determine the partial derivatives of parameters with respect to x, y, and z. Let's consider a simple case where the parameter is the distance between two atoms at positions (ux, uy, uz) and (vx, vy, vz). Then the distance is r where
r² = (ux - vx)² + (uy - vy)² + (uz - vz)²
and taking partial differentials with respect to ux yields
2r ∂r = 2 (ux - vx) ∂ux

∂r/∂ux = (ux - vx) / r
The same operations with uy and uz tell us that the force vector acting on the first atom is f = -E'(r) r / r where the direction of r is from the v atom to the u atom, and the inverse for the second atom. Here, boldface denotes a vector quantity.
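As a concrete sketch of that result, here is the bond-stretch force in Python, using a simple harmonic potential as a stand-in for the actual MM2 stretch term (MM2's real term adds anharmonic corrections, but the force formula derived above is the same):

```python
import math

def bond_stretch_force(u, v, k, r0):
    """Force on the atom at u due to a bond to the atom at v.

    Uses a harmonic potential E(r) = 0.5*k*(r - r0)**2 as a stand-in,
    so E'(r) = k*(r - r0); the force is -E'(r) times the unit vector
    from v to u, exactly as derived above."""
    d = [u[i] - v[i] for i in range(3)]
    r = math.sqrt(sum(x * x for x in d))
    dE = k * (r - r0)                 # E'(r)
    return [-dE * x / r for x in d]
```

A bond stretched to twice its natural length (u at (2,0,0), v at the origin, k = 1, r0 = 1) gives a force of magnitude 1 on u pointing back toward v; the equal-and-opposite force on v comes from swapping the arguments.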

It's usually easy to apply some geometric intuition and determine a unit vector in the direction of greatest change for a parameter p (the gradient of p with respect to a particular atom's position). For an angle θ involving three atoms at positions u, v and w, with v being the vertex, the gradient for u lies in the plane of the three atoms and is perpendicular to (u - v), pointing away from the third atom. Then the vector (∂θ/∂x, ∂θ/∂y, ∂θ/∂z) is in the same direction as the gradient unit vector, and it's necessary only to determine a scaling factor. That can be obtained by doing a little trigonometry to determine that a teeny move of distance δ in that direction will produce a parameter change dθ, and then the magnitude of the force is -E'(θ) dθ/δ.
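That geometric argument is easy to sanity-check numerically. The sketch below (my own illustration, not from any modeling package) computes the bond angle from three positions and estimates its gradient with respect to u by central differences; for a right angle with unit-length bonds, the result should be a unit vector lying in the plane of the three atoms, perpendicular to (u - v):

```python
import math

def angle(u, v, w):
    # bond angle at vertex v, between the bonds v-u and v-w
    a = [u[i] - v[i] for i in range(3)]
    b = [w[i] - v[i] for i in range(3)]
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return math.acos(dot / (na * nb))

def grad_theta_u(u, v, w, delta=1e-6):
    # central-difference estimate of (dtheta/dx, dtheta/dy, dtheta/dz)
    # for the atom at u -- the "teeny move of distance delta" made literal
    g = []
    for i in range(3):
        up = list(u); up[i] += delta
        um = list(u); um[i] -= delta
        g.append((angle(up, v, w) - angle(um, v, w)) / (2 * delta))
    return g
```

With u = (1,0,0), v at the origin, and w = (0,1,0), the estimated gradient comes out (0, -1, 0) to within the step size: moving u toward w closes the angle, and the magnitude is 1/|u - v|, consistent with the δ-to-dθ scaling described above.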

Sunday, June 19, 2011

Thinking about going solar

I answered an ad by an outfit called One Block Off the Grid (1BOG) which organizes the installation of solar panels on people's roofs. When I say "organize", I mean not only that they take care of various complex logistical issues including lining up a NABCEP-certified installer, but also that they try to consolidate system purchases to bring costs down. Over the past couple of years this has become big business in the U.S. because of state and federal government incentives encouraging installation.

The 1BOG folks sent me a proposal with numbers in it, and I have 30 days to make a decision during which the proposed price is guaranteed. I also spoke briefly with SolarFlair, a similar outfit here in my town, that does the same sort of purchase consolidation and does the installation themselves. Since my 30 days is nearly expired, I'm hoping to drop into the SolarFlair office some time this week and talk numbers with them.

In my own state of Massachusetts, the situation is that people with solar panels produce SRECs (wikipedia, explanatory video) worth around $500 each time the solar panels produce a megawatt-hour.
Massachusetts' renewables portfolio standard (RPS) requires each regulated electricity supplier/provider serving retail customers in the state to include in the electricity it sells 15% qualifying renewables by December 31, 2020... Solar Renewable Energy Certificates (SRECs) represent the renewable attributes of solar generation, bundled in minimum denominations of one megawatt-hour (MWh) of production. Massachusetts' Solar Carve-Out provides a means for SRECs to be created and verified, and allows electric suppliers to buy these certificates in order to meet their solar RPS requirements. All electric suppliers must use SRECs to demonstrate compliance with the RPS. The price of SRECs is determined primarily by market availability, although the DOER has created a certain amount of market stability by establishing a state Solar Credit Clearinghouse Auction (where prices are fixed at $300/MWh), as well as the Solar Alternative Compliance Payment (SACP) for the state RPS (set at $550/MWh for 2011). The Solar Credit Clearinghouse will only be utilized if or when SREC generators cannot sell their SRECs on the open market; the fixed price of $300/MWh effectively acts as a price floor. The SACP, on the other hand, acts as a ceiling on the value of SRECs because it is the per-MWh payment that electricity suppliers must make if they fail to obtain enough SRECs to cover their RPS obligation.
There is a federal tax credit of 30% on the cost of installation. I don't know if that's factored into the prices I've been quoted, and maybe I'd need to pay that myself upfront until I get the following year's federal tax rebate.

The 1BOG proposal offers options either to lease the system from 1BOG, or to pay for it outright at a cost of about $25K. I went to the credit union and applied for a 5-year fixed-rate $25K home equity loan, with monthly payments of about $450. 1BOG proposes a system producing about 5.5 kW peak, and they are guessing that averages out to about 700 watts continuous, which is about 6 megawatt-hours per year, for a yearly SREC income of $3200. The system saves me about $100 per month on the electric bill, and when all the dust settles, my monthly expense is about the same as it is currently.
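Back-of-the-envelope, those numbers hang together. Here's the arithmetic as a quick sketch; all figures are the rough estimates quoted above, and it ignores the 30% federal credit, SREC price swings, and loan interest details:

```python
peak_kw = 5.5                    # proposed system size (peak)
avg_kw = 0.7                     # 1BOG's guess at continuous average output
hours_per_year = 8760
mwh_per_year = avg_kw * hours_per_year / 1000.0      # about 6.1 MWh

srec_price = 500.0               # rough per-MWh market price for an SREC
srec_income_monthly = mwh_per_year * srec_price / 12.0   # about $255

loan_payment_monthly = 450.0     # 5-year fixed-rate $25K home equity loan
electric_savings_monthly = 100.0
net_monthly_cost = (loan_payment_monthly
                    - srec_income_monthly
                    - electric_savings_monthly)          # about $95
```

So even before counting the federal credit, the out-of-pocket cost is on the order of $100 a month for five years, after which the loan payment disappears while the electric savings and (reduced) SREC income remain.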

Five years later, the loan is paid off, the solar panels are my property free and clear, SREC income is reduced but not zero, and my electric bill is still substantially reduced or absent. And I will have set a good example for friends and neighbors that one can reduce one's carbon footprint without unreasonable financial hardship.

Monday, June 13, 2011

A somewhat half-baked embedded OS idea

I mentioned in my previous post that I had spent some time porting FreeRTOS to the SAM7 architecture before realizing my purpose was better served by looking for an existing port. But in the process I gave some thought to what kind of alternative to FreeRTOS I might cook up, if somehow the porting exercise didn't go well. I haven't thought through every detail completely, and I wouldn't trust myself to anticipate every issue until I'd actually coded the thing, which I haven't yet. My sketchy design is based on two ideas.

An aspect of JavaScript that fascinates me is that everything runs in a single thread (an idea nicely described here). Each function is the sole owner of the processor (at least as concerns the JavaScript world) until it completes, and there will be no unexpected modification of variables or data structures. No locks or mutexes or semaphores or "synchronized" keywords, no threading headaches. The price of this simplicity is that functions often are event handlers and must be written to do their work quickly and get out of the way so other events can be handled.

The second idea is something I've seen when coding applications for both Android and iOS. Communication between threads is carefully controlled. Slow operations are begun by event handlers in a UI thread, and when the work is done, another handler runs in the UI thread, supplied with any relevant results from the slow operation. Within the handler thread, JavaScript's protocol of running only one handler at a time to completion is observed. Where in Android one would post a Runnable to a Handler, I would be inclined to create a postEvent(event, arg) function, since I'd plan on doing things in C, with "arg" pointing to a struct containing whatever information needs to be retrieved from the slow operation. That way, there is never a point in time where the slow operation and the completion handler are running concurrently, and again there is no need for arbitration of data access.

Flinging structs around requires malloc and free. That worries me a little because fragmentation, memory leaks, and low memory are all likely to be more troublesome on an embedded microcontroller than a desktop computer, and I'm not a memory allocation guru. Maybe there's some way to avoid memory allocation altogether, or maybe it will be less of a problem than I fear.

The general idea would be to implement a single handler thread and a fixed pool of worker threads. Worker threads idle until assigned a task; a handler can post a task to the worker thread pool where it will be picked up as soon as a worker thread is available. There might be an event fired when the number of available worker threads went from zero to non-zero; I haven't decided yet whether that's useful.

There would be events for various hardware stimuli: pushbuttons pressed, UART character received, Ethernet byte received, timer gone off, things like that. There would also be user-definable events which would include task completions. There would be some simple way to associate handler functions with events.
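To make the design concrete, here is a toy model of it in Python (purely illustrative; the real thing would be C on a microcontroller, and the names post_event, start_task, and on are my inventions echoing the postEvent(event, arg) idea above):

```python
import queue
import threading
import time
from concurrent.futures import ThreadPoolExecutor

# One handler thread drains an event queue and runs handlers to completion,
# one at a time; slow work goes to a fixed worker pool, and each worker
# posts a completion event carrying its result back onto the queue, so a
# worker and a completion handler never run concurrently.

events = queue.Queue()
handlers = {}                      # event name -> handler function

def on(event, handler):
    handlers[event] = handler

def post_event(event, arg=None):   # safe to call from any thread
    events.put((event, arg))

workers = ThreadPoolExecutor(max_workers=2)   # fixed pool of worker threads

def start_task(task, arg, done_event):
    # Run task(arg) on a worker; deliver the result to the handler thread
    # as an event rather than touching shared state from the worker.
    workers.submit(lambda: post_event(done_event, task(arg)))

def dispatch_loop():
    while True:
        event, arg = events.get()
        if event == "quit":
            return
        handlers[event](arg)       # run to completion, one at a time

# Demo: a "slow" squaring task whose completion handler records the result.
results = []
on("square-done", results.append)
on("go", lambda n: start_task(lambda x: x * x, n, "square-done"))

post_event("go", 7)
t = threading.Thread(target=dispatch_loop)
t.start()
time.sleep(0.2)                    # let the worker finish and post its event
post_event("quit")
t.join()
workers.shutdown()
```

The property to notice is that results.append runs on the dispatcher thread, not the worker thread, so handlers never need locks, which is exactly the JavaScript-style simplicity described above.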

That's basically as far as I've gotten with it. Obviously there's a good deal left to do, and I won't really feel good about it until I see it doing something interesting and useful on a real microcontroller.

Of microcontrollers and operating systems

In recent weeks I've put some effort into working with a Beagleboard running Angstrom Linux. The Beagleboard has an OMAP3530 processor, a ridiculously over-powered thing. It's cool to be running Linux on something you can attack with a soldering iron. But as I looked at my intended application, the price of the Beagleboard, and the hoops they jump through to manufacture it, I started to wonder if more conventional weapons might be sufficient to win the day.

I'd blogged in the past about the AT91SAM7, another family of ARM-based chips that are a little less over-powered, so I wondered, would they work for this? My first thought was to use Angstrom Linux on the SAM7. I found that nobody had done it, and digging deeper to find out why, I was reminded that Linux requires an MMU and the SAM7 doesn't have one. Neither of these was a surprising piece of information, and I was probably dimly aware of both, but I had never consciously connected them.

The reason Linux needs an MMU is that it runs multiple processes in separate memory spaces, so that one process can't crash another by overwriting its memory. This requires remapping from virtual memory addresses to physical addresses, which is most of what an MMU does.

It's shameful to admit, but I had unthinkingly assumed that 32-bit processors would necessarily run something like Linux, merely by virtue of being 32-bit processors. This was the result of having grown up with 8-bit processors and thinking of 32-bit processors as "big" and "complicated" and a little "scary". They are in fact all those things, but we still need to keep our wits in their presence.

Casting about for an operating system that might be more SAM7-friendly, I came across FreeRTOS. I started puttering around with a FreeRTOS port for the SAM7, and after banging on that a while it crossed my mind that there might be some other ARM chip for which a FreeRTOS port already existed, so I wouldn't have to do the port myself. A little investigation in this direction led me to the LPC1768 (overview, datasheet, user's manual, Digikey listing), an inexpensive ARM chip with lots of flash and RAM, an Ethernet controller, USB controllers for both host mode and device mode, buckets and buckets of GPIO pins, and a comfortably higher number of MIPS than the SAM7 family. The LPC1768 has an ARM Cortex-M3 core (overview, user's guide).

So what are our hardware development options here? Sparkfun provides a nice little board for only $50. It has tons of pins, sanely spaced at 0.1", and a JTAG connector on one end and a mini-USB on the other. It does require a power supply but that's not unreasonable. It has two pushbuttons (one a reset) and an LED. While I heartily encourage anybody to buy this board, I ended up buying a different board (which, time will tell, I may regret) because it was available on eBay.

I'm hoping to see the board in about a week, and I'll try to get FreeRTOS running on it with reasonable haste. Hopefully it will all work out nicely and I'll get to do a lot of blogging about it. The LPC1768 is really an interesting chip with a lot of on-chip peripherals, and I'd expect that would be a good amount of fun.

Friday, May 06, 2011

Beagleboard, OMAP, and Angstrom

I've been doing a lot lately at work with the BeagleBoard, shown at right. It uses an OMAP processor from Texas Instruments. The OMAP family is ARM-based and includes a DSP core, along with an intimidatingly rich set of on-chip peripherals. The OMAP is built with a package-on-package arrangement so that the RAM die sits right on top of the processor die.

The board typically costs about $150 in the United States. You'll need a few things to work with it: an SD card, a USB adapter to write the SD card, a USB-to-serial adapter to communicate with the board, a 5-volt power supply, and later (maybe sooner) you'll want a USB hub with a RJ-45 ethernet jack.

A minimal setup is shown at right. This is just enough to connect to the board over a serial port (115.2 kbaud, 8N1, no flow control) and verify that you get a working Linux shell. The Angstrom Linux distribution has a bit of a learning curve but it seems well thought out.

I'm thinking of trying Angstrom on one of the AT91SAM7S boards from Sparkfun when I get a little spare time. I think that would work, and it would really rock to see full-blown Linux running on a $36 board. I don't know how I'd handle networking in that kind of situation, though.

Update: I am reminded that the SAM7S lacks an MMU so it can't run Angstrom. There is a different Linux distribution called uCLinux that works without an MMU; maybe I'll try that some day.

Friday, April 22, 2011

Chinese government hackers may be attacking Amazon's cloud service?

A friend recently received the following email from the website which is posting a petition for the release of artist Ai Weiwei, a critic of some of the policies of the Chinese government. In about the last 24 hours there have been a lot of attacks on Amazon's EC2 cloud service, bringing down the petition site and a lot of other unrelated websites. There is conjecture that these attacks are from hackers working for the Chinese government. Here's the email my friend received:
Dear [friend of Will],

The petition demanding the release of Chinese artist Ai Weiwei has nearly 100,000 signatures.

Here's how we know it's really gotten Beijing’s attention: For the past three days, the website has been repeatedly targeted by cyber attacks coming from China that aim to bring our site down, which would keep people from signing this petition.

Our engineers are working around the clock to fend off the attacks and, for now, the petition is still up.

We need to let the Chinese government know that illegal tactics from within its borders won't stop the mounting pressure on it to release Weiwei. If you haven't already, please join nearly 100,000 members and add your name to the petition now:

To recap: Acclaimed dissident artist Ai Weiwei -- who helped design the famed “Bird’s Nest” stadium for China’s Olympics -- was arrested on April 3rd by Chinese security forces at the Beijing airport. His office and studio have been ransacked, and no one has heard from him since.

The international art community banded together, demanding his release -- and the directors of more than twenty leading museums (including the Tate Modern, Museum of Modern Art, and the Guggenheim) started a petition that has garnered worldwide attention, including in the New York Times, LA Times, and Guardian.

The campaign has helped to give rise to an international outcry. Political leaders around the world are calling for Weiwei's release and activists have organized peaceful protests at Chinese embassies and consulates.

Though China is desperate to silence its critics, the pressure to free Weiwei continues to grow. You can help by signing the petition now:

Autocratic governments know that the internet is a democratizing force, and they'll do everything they can to suppress online activism. Know that we stand with you for change, and that we will continue to fight to make sure your voice can be heard.

- Patrick and the team

P.S. Due to these repeated attacks, our site may be slower than usual or unavailable at times over the next few days. Thanks for your patience.
If it's true that the EC2 outage is the work of Chinese government hackers, it's a little scary. It means they are capable (like Al Qaeda) of attacking assets on American soil.

It might have been a stupid move, because they inconvenienced some of the better-funded Internet companies in America, which have access to some of the best online forensics experts in the world, so there's a good chance the hackers will be identified. Maybe future attacks of this sort can be prevented.

Interesting times. We can only hope that Ai Weiwei is free soon and able to speak freely.

Monday, March 21, 2011

March 2011 trip to Shanghai, Suzhou

Last week I was in Suzhou, a city a little west of Shanghai, and took some photos. Very interesting place with a lot of rather ancient history. I liked very much the Humble Administrator's Garden. It's quite large, with several small buildings and waterways and paths, and very pretty as you can see here.

Suzhou is a very pleasant place. I felt quite safe walking around after dark. There are lots of little outdoor markets. I stopped at one to get some squid on a stick, which was spicy and tasty. My photo of the squid-on-a-stick guy is unfortunately a little blurry.

I'm still kind of tired from jet lag. When my energy level is a little higher I will add more stuff to this. Generally the trip left me with a very positive impression of mainland China, which was a surprise, as I'd been told to expect it to be a bit backward culturally. We were there for electronics manufacturing and there was certainly plenty of that, and plenty of heavy industry in the Shanghai area. Lots of construction, lots of big cranes all over the place.

Wednesday, March 09, 2011

AT91SAM7S and Android help you bang bits

There are plenty of test instruments (oscilloscopes, logic analyzers, spectrum analyzers, etc) where you plug some hardware into your laptop's USB port, and the laptop screen shows a display that would have appeared on a cathode-ray tube in decades past. It's very cool that we can do this, and these USB instruments are much more affordable (and much much easier to carry) than the old-school stuff that I grew up with.

The BluetoothBitBang is a gadget that comprises two boards from Sparkfun Electronics. One is an AT91SAM7S-64 header board, the other is a Bluetooth serial interface. You can see there are also some AA batteries in there to power the thing. This connects over Bluetooth to your phone, running a free app available on the Android Market. You can use buttons on your phone's screen to set or clear six output bits, and you can read six input bits. The two boards cost $71, and if you're willing to do some fine soldering and use the bare version of the Bluetooth module, you can knock off twenty bucks. If I'm energetic, maybe I'll see about putting together some kind of significantly cost-reduced version. That might depend on the level of interest I see in the thing. I've posted a Wikipedia page with a lot more information, including the schematic of how the boards are wired up.

The SAM7 firmware and the Android app source code are both publicly available on Github. I'm an Android fan, but the Bluetooth protocol for talking to the board is quite simple and if anybody is interested in writing an iPhone or BlackBerry app for the thing, I'll be happy to provide some support to make that relatively easy.

I think this whole thing gets a lot more interesting when (1) you move from a phone to an Android tablet, which will be cost-effective as tablets flood the market over the next year or two, and (2) start building much more sophisticated data acquisition front-ends. This is just about the simplest acquisition hardware I could imagine that would still be worth the effort of building and debugging it, but no reason one couldn't do a Bluetooth-connected oscilloscope or logic analyzer.

Tuesday, March 01, 2011

Sparkfun's Bluetooth serial-port board

This was preparation for the project in the next post.

I've been tinkering with the BTM-182 Bluetooth serial port module, available from Sparkfun as either a raw module or a convenient breakout board. I've set the baud rate to 115.2 kbaud, connected the module to a USB serial port (appearing as /dev/ttyUSB0 on my Linux netbook), and powered it from a USBMOD4 board from Hobby Engineering, whose only purpose here is to provide 3.3 volts. The serial port uses an RS-232 level shifter from Sparkfun.

I wrote some Python code that runs on the Linux netbook. It opens the serial port and provides a teeny calculator-like command interpreter to anybody connecting over the Bluetooth serial connection offered by the BTM-182. Currently I'm using CoolTerm running on a MacBook for that, pairing with the "Serial Adaptor" device using PIN "1234".

Using the calculator-over-Bluetooth looks like this:
Good morning
multiply 3 4 5
add 6 8 12
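The exact grammar of my teeny interpreter isn't important, but to give the flavor, here's a minimal sketch (a guess at a reasonable shape, not the exact code I ran) of an evaluator that accepts "add" and "multiply" with any number of arguments:

```python
from functools import reduce

def evaluate(line):
    """Evaluate one calculator command, e.g. 'add 6 8' -> '14'."""
    parts = line.split()
    if not parts:
        return ""
    op, args = parts[0], [float(x) for x in parts[1:]]
    if op == "add":
        result = sum(args)
    elif op == "multiply":
        result = reduce(lambda a, b: a * b, args, 1.0)
    else:
        return "unknown command: " + op
    return "%g" % result  # "%g" shows 14 rather than 14.0
```

In the real program, a function like this sits in a loop that reads a line at a time from the pyserial port object and writes the result back over the Bluetooth link.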
Later I'll replace the netbook with an AT91SAM7 microcontroller board, also running a little command interpreter, and use the Bluetooth connection to talk to my Android phone. The next step is to hang some analog data acquisition hardware off the SAM7 and make a low-speed oscilloscope, displaying waveforms on the phone.

Friday, February 25, 2011

Will tries interval training

Lately I've been doing some high-intensity interval training, where "high" takes into account that I'm a baby boomer with a desk job. If you have my sort of Homer-Simpson-esque physique, then start with things you can do without injury, not what the 18-year-old neighbor can do without injury. ObDisclaimer: Talk to your doctor before beginning any exercise regimen.

Studies (1, 2) conclude that brief interval training sessions two or three times per week, totaling just a few minutes of high-intensity exercise per session, can produce benefits similar to those from tedious 90- to 120-minute aerobic Jane Fonda workouts. The clearly measurable part appears to be that you can bump up the oxygen usage of your muscles, which means that both your muscle mass and your mitochondria count have increased. Interval training also causes your muscles to continue burning extra calories for several hours after your workout (1, 2, 3). If I leave my heart-rate monitor on, I see that my heart rate remains mildly elevated long after I've stopped.

After about a month of doing this, I haven't seen any visible shrinkage of my midsection, but I've definitely got better stamina. I have a much easier time climbing stairs or getting up from sitting on the floor. All my exercise has been lower body, to take advantage of the larger muscles, but I'm pretty sure I've gained strength in my upper body as well.

I think I would benefit faster if I were more careful with my diet. I keep thinking about cutting back on carbs; maybe I'll actually do that. It's silly to go to the trouble of exercising and not add the piece that would actually allow me to lose some body fat. Sillier still to regard interval training as a license to eat donuts.

Friday, February 18, 2011

Random punditry regarding IBM's Watson

I followed with considerable interest this week the game show Jeopardy! where one of the contestants was an artificial intelligence built by IBM called Watson.

Ordinarily I would try to offer some unique insight of my own about Watson. I would be tempted to acknowledge Ken Jennings' rephrasing of the now-ubiquitous Simpsons quote, "I, for one, welcome our new XYZ overlords". And I'd give my thoughts about what problems of modern society might be effectively addressed by this new technology, possibly in economics, medicine, or social policy.

But so many large buckets of ink have already been poured over the topic of Watson that I think I'll kick back and let the harder-working pundits and bloggers have this one. So let's get started.

An online publication called Washington Technology, whose business is to ensure that Beltway contractors know just enough 1337speak to get by, mentions that Watson will now be working with some medical schools, presumably to suck their knowledge into its database. The original source for that information appears to be an AP news story. Then it will absorb speech recognition technology from Nuance, Inc., who had previously absorbed Dragon Systems. This will address a limitation Watson faced during game play: it could receive queries only as electronic text messages.

Not much insight from EETimes, alas. They talk about a couple of pedestrian applications of data mining (basically what Netflix or Amazon does all day) in medical diagnosis where, like Watson's possible Jeopardy answers, each is assigned a confidence level, and in... wait for it... identifying patterns in shopping behavior, like the card readers at my local grocery store. Gee, that sounds world-transforming.

MSNBC talks about the same stuff Washington Technology talked about, and adds the data mining angle, this time playing Whack-a-Terrorist with license plates, credit card transactions, Internet activity, flight manifests, phone records, bank records, blah blah blah, every dystopian movie you've seen since 1993.

That appears to cover 99% of the recent writings about Watson. A little disappointing. Maybe I'll need to come up with something myself after all. Hmm. Maybe Watson's next skill set should be online punditry.

Saturday, February 12, 2011

Watson competes on Jeopardy

Watson is a computer developed by IBM researchers with the goal of competing on the game show Jeopardy. Watson's appearance is only two days away: it will compete against the planet's two best human Jeopardy players, Ken Jennings and Brad Rutter, airing on Monday, Tuesday, and Wednesday evenings.

This is a publicity event for IBM in the same spirit as the 1997 six-game chess match in which Deep Blue defeated Garry Kasparov. But this is much more important. Deep Blue's technology was applicable only to chess and other deterministic games, amounting to a deep search of the tree of possible future moves.

Watson uses a much broader range of technologies in natural language processing, data mining, machine learning, and resolving ambiguities of communication. It is much likelier that work done on Watson will be applicable to really important problems in medicine, economics, foreign policy, and other areas where there is a significant opportunity to raise the quality of human life.

I don't ordinarily go around recommending that people watch a particular television program, but I'll make an exception here. I'll make this easy: go to Jeopardy's When-to-Watch page, click on your state, and see history unfold. As if Egypt wasn't enough history unfolding for the month of February. If you're in the Boston area, Jeopardy is at 7:30 PM on WBZ (channel 4).

Thursday, February 03, 2011

Android app: a timer for Sprint 8 workouts

I recently learned about an interesting exercise technique called Sprint 8, promoted by a guy named Phil Campbell, through my sister's interest in Joseph Mercola, a doctor who has written about Sprint 8. The idea is pretty simple. Pick a favorite exercise, maybe a stairmaster or a stationary bike, and do eight sprints in the following way. Remember to consult your physician before starting any exercise program.
  • Do two or three minutes of warm-up, nothing too strenuous.
  • Push yourself for 30 seconds. Work as hard as you can without risk of injury. This is a "sprint".
  • For 90 seconds to 2 minutes, move to a slower, easier pace. Catch your breath. This is called "active resting".
  • Do a second 30-second sprint, followed by another 90-to-120-second active rest.
  • Repeat until you've done a total of eight sprints.
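For the curious, the schedule those steps describe can be sketched in a few lines of Python. The warm-up and rest durations below are defaults I picked from the ranges above, not necessarily what the app uses.

```python
def sprint8_schedule(warmup=150, sprint=30, rest=120, sprints=8):
    """Return the workout as a list of (phase, seconds) pairs."""
    phases = [("warm-up", warmup)]
    for i in range(sprints):
        phases.append(("sprint", sprint))
        if i < sprints - 1:  # skip the active rest after the final sprint
            phases.append(("active rest", rest))
    return phases

# Total workout time with these defaults: 1230 seconds, about 20 minutes.
total_seconds = sum(seconds for _, seconds in sprint8_schedule())
```

The timer app just counts down each pair in order, signaling at every transition so you don't have to watch a clock mid-sprint.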
The Android app is a timer for doing Sprint 8, and the source code is posted on GitHub. If you're set up for Android app development, feel free to compile it and try it on your phone; the app is also now available in the Android Market.

There's a lot of exercise physiology behind Sprint 8 that, in all honesty, I haven't yet studied. Maybe I will in the future, and possibly blog about it. But I do know that after just a couple of short Sprint 8 workouts I feel really good. My back pain is way down and I get less winded when I climb a flight of stairs. Sprint 8 workouts are claimed to boost human growth hormone (the stuff outlawed in Olympic and professional sports because it gives athletes an unfair competitive advantage), which appears to have anti-aging effects. Also see "interval training", believed to work well for fat loss.