Thursday, November 19, 2009

Strike that....substitute silence.

Dear all,

I have made a mistake, and perhaps led you all astray.

It is true that in the Axopatch, the signal ground is not connected to the power ground.

But it turns out that it's very difficult to isolate power ground from signal ground without essentially breaking every BNC connection between the amplifier and the digitizer. And when I did that, I got a good amount of hum pickup from those now unshielded wires. At times you could find a position where the pickup was minimal, but it wasn't easy. And given that you've got at least three such connections (signal output, analog input, and gain), it all adds up to one big pain in the butt.

This advice originated when the digitizers were the old Digidata 1200. That digitizer was specifically designed to isolate the signal ground from the power ground (most importantly, to isolate it from the computer, whose fast-switching power supply is a huge noisemaker). That all changed with the 1300 line of Digidatas.

However, it does seem that the simple act of using a grounded-to-ungrounded plug adapter on the amplifier dramatically reduces the RMS noise level. Why exactly it does this, when there's still a connection between signal ground and power ground, I'm not clear on.

Nevertheless, I am available to come and lay my hands on your setups to reduce the noise. 'Cause it seems I'm back to that level of rationality for this whole process.

Gotta bring this up with the big guy soon.

Wednesday, November 18, 2009

Geek alert

I might just be in productivity nerd heaven:

Liquid Planner added task timers!

Now I can reach new levels of accuracy in tracking blogly time wasting.

Tuesday, November 17, 2009

A Tale of Two Grounds

It was the best of rigs, it was the worst of rigs,
It was the age of 0.07 pA RMS, it was the age of 0.5 pA RMS, ±5 pA p-p at 60 Hz,
It was the epoch of separating signal ground from power ground, it was the epoch of connecting all power grounds to the signal ground,
It was one really annoying run-on sentence written by some old white dude, it was a kick ass blog post written by a middle aged white d00d.

There's one thing I've learned here as a postdoc that I've actually never seen discussed in other places. When I first heard it, I did a "Whatchu talkin' 'bout?" head turn, thinking it was crazy.

That thing is, to get really good noise on your patch clamp rig (good meaning low here), you can make 2 grounds. One ground connects everything that is physically close to the headstage. This gets connected to the signal ground plug. The second ground connects everything else. That gets connected to power ground.

Figure 1: Nat's newly refurbished setup (which is tight if I do say so myself). Axopatch 200B and Sutter MP-285. Green As indicate things attached to signal ground. Magenta Bs are connected to power ground.

So what goes on the signal ground?
  • Microscope
  • chamber
  • condenser
  • bent piece of metal that can be put right in front of headstage/chamber to further shield (not shown)
That all gets connected to the gold pin at the back of the headstage (or equivalently at the signal ground input on the back of the amplifier). Note that you'll need to break the amplifier's power ground connection (with a 3-to-2 adaptor). Doing this will get you most of the way there.

What goes on power ground?
  • Faraday cage
  • air table surface
  • manipulator
This is connected to power ground (either by attaching it to an exposed copper pipe or by going through the case of the manipulator). Doing this will get you to the super low RMS values specified in the Axon manual.

Now I no longer fear denoising the setup.

The only tricky thing I've run into is that most of the BNC inputs on the Digidata are connected to power ground. So usually I have to break the ground connection to connect the gain value output from the amplifier to the digitizer input (I've never had the connection from the amplifier signal output to the digitizer analog input cause this).

I just turned on my amplifier, and with a pipette holder on, the PATCH mode RMS noise is 0.099 pA, and WHOLE-CELL mode is 0.48 pA. SHU-WEET!



Thursday, October 29, 2009

What have I, what have I, what have I done to deserve this?

When your electrophysiology rig has only a single BNC cable connecting the patch clamp output to the digitizer, you know things aren't good.

Especially when there's still a little 60 Hz ripple visible.

And when you still haven't solved the periodic pipette vibration that makes getting seals hard.

[Then after you write some schlock like this, you worry about what PLS thinks about it.]

In other news, I have discovered a couple things:

When there is 0.5 pA RMS noise on a well shielded headstage, cleaning where the holder attaches can do wonders. Isopropanol + canned air = 0.05 pA RMS.

When ultrasonic cleaners are used correctly, they kick total ass. If you have one, try the glass slide test. HOLYMOLYCOWTHATISCOOLANDSADLYMIGHTBETHEWEEK'SHIGHLIGHT!

Wednesday, October 28, 2009

InaDWriMo2009 - do it, do it now!

As if it weren't patently obvious, I'm still swamped under the combination of multiple projects, multiple collaborations, and multiple offspring. I just haven't hit upon a way to incorporate much blogging.

Among the other things languishing are a couple of projects in different stages of writing. One is a manuscript revision that has been in periodic hibernation since I left my thesis lab. (Ok, it is embarrassing to write that. The reasons for such a state might be both understandable and inexcusable, but at this point, who cares? It's time to get it done). This requires a substantial rewrite of the Discussion section, as well as the Introduction. It also requires some time with the literature, which I have been only peripherally following since this project came off the front burner.

Another writing project is a draft of a short manuscript, on some old results that started fast and then got stuck in neutral. While this project isn't completed, I think I know enough of the general story to at least start a draft. Writing that will make it much more obvious what needs to be done in order to publish it.

To help motivate me, I've decided to join up with the InaDWriMo2009 that Dr. Brazen Hussy is hosting. It's derived from a dissertation-specific "fork" of NaNoWriMo that has been extended to all sorts of academic writing projects.

I figure that, combined, both of these will require something on the order of 5000 words. My goal is to complete them during the month of November. Watch the sidebar for progress.

And, who else is in? The motivation is contagious. Sorta like H1N1, but with mental pain rather than physical.


Monday, August 31, 2009

Shallow thoughts

Sometimes I feel like experiments are performance art with no audience.

Friday, August 7, 2009

Blogrolling, Junction Potential Style!

I recently started following Jerry Coyne's blog over at "Why Evolution Is True" and I wanted to highlight it in case it flew under other folks' radars. Jerry is a prominent evolutionary biologist, and is well known for his critiques of intelligent design and other creationist gobbledygook. His essay, "The Case Against Intelligent Design: The Faith that Dare Not Speak Its Name" (here), is required reading for anyone interested in the subject.

Today's issue of Science has Jerry's review (sorry, behind paywall) of Chris Mooney and Sheril Kirschenbaum's book Unscientific America. It's definitely worth a read (check back at WEIT, as he might be able to post more of the review there), though a little more shrillness would help. ;)

Now, I'll admit I'm somewhat partial here, seeing as Jerry teaches at the University of Chicago (oh, dear alma mater). In fact he taught the required evolutionary biology class back in 1995 when I was but a larval biologist. Sadly though, at the time I was a fairly uninspired student of the subject, making me wish I could do it over again now. Maybe in another life.

Thursday, August 6, 2009

Oh my god, I'm so sorry I forgot...

...our blogoversary.

However, there's no doubt that this is much better than forgetting my wedding anniversary. I suppose, back in mid-June, that I should have connected the sighs of freshly minted M.D.s and Ph.D.s with the start of my blogging. Instead all I could think was, "Why the hell are these people milling around when I'm rushing to pick up my kids at daycare!"

So more than a year has passed, and though I haven't had much time to write a coherent blog post, I do have time for:

Nat's Bullet Points Highlighting the Last Year!

Accomplishments:
  • Successfully (more or less) integrated a new family member into the finely tuned machine that was Family Blair (HA!). There are posts aplenty conceived to cover this, from the innate differences between kids, to balancing the demands of two kids with work/science. Sadly, balancing the demands of two kids with work/science means writing these posts doesn't happen. Still though, the productivity hit after the second kid was MUCH more minor than after the first.
  • Finally published a good chunk of my work covering the modulation of a particular TRP channel. This project has been tortuous at times, yet through many difficulties, I stuck with it. The result is somewhat limited in its scope, but the treatment is thorough. Hopefully it will prove useful to those interested.
  • Started collaborating with another group on a cool new project. Things seem to be progressing nicely, and it has been a lot of fun.
  • Actually made some progress in finishing up my final paper from...gulp...my thesis work. Or as I like to refer to it, My Own Personal Albatross.
  • Have the prospect of starting another collaboration, that would help me complete some older preliminary work.
  • I have totally regained my calcium imaging mojo, after neglecting those long lost 1997-1999 era skills.
  • Lastly, I finally came to the realization that it's no use to fight against the core of my own personal scientific style. Sure, some parts can (and should) be bent in response to outside considerations. But to fight against the core will lead only to madness.
Failures:
  • I haven't made sufficient progress on the other side of the TRPC channel regulation story. The conceptual framework is basically there, and I've got tons of planned expts, but little time to carry them out, or to do the requisite troubleshooting.
  • I didn't complete the manuscript of My Own Personal Albatross. Really though, I blame the daughter. I thought I had 6 more weeks!
  • I haven't done enough to move from heterologous to endogenous systems for studying TRP channels (TRPCs in particular). There are a number of reasons this is tricky, but honestly, that potential trickiness has prevented me from really trying. That's got to change.
  • I definitely haven't been able to get any sort of blogging routine down.
Overall, that's not too bad. Sure, I could always have done a lot more, like learn Esperanto or program my own electrophysiology software in COBOL, but it was a pretty good year.

So what did you accomplish this past year?

Monday, June 1, 2009

Meme-licious

The always alliterative Ambivalent Academic haz tagged me with the Cover Meme. Though I feel like I might be at the center of the maelstrom, that doesn't mean I have no time for a meme, right?

So, I present to you, the best cover. Ok, not the best; Jimi Hendrix's version of Dylan's "All Along the Watchtower" was taken, as was Johnny Cash's "Hurt." But this certainly is pretty cool. Let's set the scene for when you'd look to fire this up:

You're a parent, you've got a crabby/colicky/ornery child who just won't go to sleep despite drooping eyelids above black circles. Aha, a lullaby, that'll work. Now what to pick?

Need you ask? The answer is obvious: The lullaby renditions of Tool. Because there's nothing else I want my infant going to sleep to besides such sweet compositions as "Opiate" and "Schism." Here's hoping Volume II has Stinkfist. (Actually, I dig Tool, but c'mon.)

As for the worst cover, I have to go with Sixpence None the Richer's version of The La's "There She Goes". Why? Because other than the switch to a female lead vocal, this cover is essentially exactly the same as the original. That's a big fail in my book. Great covers reimagine the original in ways that the composer never saw, adding to and extending it. Every time I hear Sixpence's version I just think, "sheesh, I liked the version in 'So I Married an Axe Murderer' a helluva lot more."

Here it is if you must:

Tuesday, May 26, 2009

Wordly advices

If you ever find yourself writing a scientific abstract containing the terms "unitary conceptualization" or "cannot be considered equipotential in either its BLAH BLAH BLAH" please just stop.

Sunday, May 10, 2009

Happy Mother's Day!

HAPPY MOTHER'S DAY to all you moms out there!!!!!1!!1Eleventy!!1!

Here's hoping you get a little extra time for yourselves to relax and enjoy the day. Here in the Blair house we're celebrating with a spinach and pepper frittata, bacon, home fries, and mimosas made with fresh squeezed orange juice. Break out the champagne flutes!!!

Then later we'll feast on the chocolate mousse cake. YUM!

Saturday, May 2, 2009

HAPPY ANNIVERSARY WIFE!

Ten years ago today, my wife and I stood in the same church where her parents were married, and where she was baptized, received first communion, and was confirmed. We pledged ourselves to one another, and we've survived residency and fellowship, graduate school, and the travails of children. It hasn't always been easy, but there's definitely no other person I'd want to experience these things with.

Here's to the next decade together!

Wednesday, April 29, 2009

CAN YOU SMELL....WHAT THE ROCK U. PRESS IS COOKING?

There are a couple of interesting editorials out from the Rockefeller University Press, which publishes the fine journals Journal of Cell Biology, Journal of Experimental Medicine, and one of my personal favs, Journal of General Physiology.

In the first, the Press's Executive Editor, Mike Rossner, discusses the practice of bundling large numbers of journals by the mega scientific publishers, and the effects on university libraries. Unsurprisingly, the current economic climate is affecting not just newspapers (do you hear that Boston Globe? That...is the sound of inevitability), but will have big impacts on science publishing. And that doesn't even take into account moves towards Open Access. Check it out here.

Here's one very interesting tidbit from the editorial:
"The Rockefeller University library subscribes to bundles of online journals from several megapublishers. For one of the bundles, the top 10% of journals garner over 85% of the hits to the bundle from users at the University. Over 40% of the journals in the bundle had no hits at all from the University in 2008!"
In the second editorial, from the May issue of JGP, Editor Edward Pugh takes on one of my personal hobby horses: Supplementary Data. Now in principle there's nothing wrong with supplementary data; it's just that currently there seem to be few standards about how it should be dealt with, both in review and in archiving. Pugh clearly sets out at least JGP's position on it:
"Several pressures now call for a review of policy on Supplemental Material. One pressure comes from the growing use of such material by other journals as an omnibus substitute for publishing scientific material. Increasingly, methods, theory, and even primary results are offloaded to supplements. As a community, we need to question such practices, asking whether they are dictated by the goals of science or by financial expediency, and inquire as to the short- and long-term consequences of such practices for science."
So go check that out too. Oh, and while you're there, check out a modest little paper by Blair, Kaczmarek and Clapham. All 14 figures of it that is!

Friday, April 24, 2009

Poll crashing vanguard for the science!

Ok, we all know internet polls are far from scientific, but if nothing else, they give an inkling of the organization behind the alternative positions.

In this case, there's a poll at the LA Times blogs (here), stemming from a recent pro-research rally held at UC in support of animal research. The results have tilted towards the anti-research poll option, so any of those folks out there who support the responsible use of animals in research, and realize that there can be little biomedical science without it, go and vote. And tell your labmates, friends, parents and grandparents to do the same.

And if you're interested in the things that impede meaningful debate about animal research, go check out Dr. Free-Ride's recent series on the topic!

Tuesday, April 14, 2009

Phew, so glad that is over!

I know there are many of you out there, sitting around, cooling your heels, and waiting with bated breath to learn how to actually measure a junction potential. And that post is coming. It's just not this post.

This post is the final end of a big sigh that wraps up last week's Week of Not Very Much Fun.

-First, I spent the weekend before last worrying about whether I had left my bag (complete with laptop) inside our daycare center or outside in the parking garage. The first being probably ok, the second not so much. It's amazing what five months without more than 4 hours of continuous sleep will do to a brain's powers of recollection. Luckily, the bag was saved by one of the daycare folks. So I didn't lose my 4 year old laptop with its broken hinge, on which I'm writing this very post to you good readers.

-Second, I was all ready to spend last week working on a data presentation for the lab on Thursday (which is fairly involved, given the size of the lab and the long stretches of time between turns), as well as a 25 minute talk for the Neurobiology Dept. on Friday. But fate intervened: our paper finally got accepted (good!!), but the editors asked us to turn the final changes around in 2 days (ok, doable, but starting to cut things close).

-The final straw was the girl coming down with an ear infection. She needed a couple days off from daycare, which my wife and I split. She improved so quickly after the ped visit; thank FSM for antibiotics. That basically killed the possibility of the lab data club, but I was able to get the final manuscript changes done and the talk prepared. Of course, with even less sleep than usual. I think this led to one assessment of my talk, which was "Clear, but needed more enthusiasm." Fair enough, but we're almost at the breaking point here, people.

Mythbusters!

No, not those guys on TV, though they do kick ass.

What I'm referring to are some myths we live by in the lab. One cherished by some people is that competition inside the lab improves productivity. That's complete bullcrap. Sure, there are lots of PIs whose management styles use it, but it's simply wrong. Apparently Candid Engineer's PI feels this way. Which sucks, even if it is all too common.

Unfortunately, what these jerky PIs ignore is the actual data that suggests that competition within groups hurts overall productivity. Teresa Amabile is a Harvard business school prof who has tracked the daily work of people in high tech, chemical, and consumer products industries, and the results run completely counter to many of the preconceived notions we have about creativity. An article in Fast Company discusses the 6 Myths of Creativity. All of them are good, but here's the money quote for this discussion:

5. Competition Beats Collaboration

There's a widespread belief, particularly in the finance and high-tech industries, that internal competition fosters innovation. In our surveys, we found that creativity takes a hit when people in a work group compete instead of collaborate. The most creative teams are those that have the confidence to share and debate ideas. But when people compete for recognition, they stop sharing information. And that's destructive because nobody in an organization has all of the information required to put all the pieces of the puzzle together.

I wonder how well this observation scales beyond individual lab groups to science as Science. How much competition is good, and when does it start to be detrimental? Certainly the last sentence here can be applied to Science.

Next up: Bob Sutton, a Stanford B-school professor who wrote a book called "The No Asshole Rule" (how great is that? Plus he has a kickass blog, which has been on the Googly Reader for some time). His recent post highlights another group's paper
...using quantitative analysis to uncover patterns across large numbers of studies -- in this case, 72 studies of nearly 5000 groups. The overall findings aren't a surprise, that groups that engage in more information sharing enjoy better performance, cohesion, knowledge integration, and satisfaction with decisions made
Now sure, these aren't academic lab groups, but as all the crankass PIs out there seem to insist on saying, academic science is the real world. So for any of you PI types who buy into this line of fallacious thinking, just stop. You're a scientist; go with the data.

And if not, then do us all a favor and wear a goddamn button that says, "I'm the Michael Vick of pitting my trainees against one another." Then we'll all be fairly forewarned.

Wednesday, April 1, 2009

What is a junction potential?

It's clear to me that a number of visitors to this humble blog arrive each day via a Googly search for the term "junction potential." I can only imagine that some must be fellow electrophysiologists, perhaps in their formative larval stages, searching for more information about this important topic. So, as a service to these folks, I thought a post or two about junction potentials would be in order. First, what is a liquid junction potential? Then, how do you measure and correct for them?

So, what is a liquid junction potential? Sure, maybe you could look in some of the Electrophysiology Bibles. Or maybe you could even hit up an electrochemistry textbook. But it's 2009, and you've got two things on your side: Google, and me. So forget that, and allow me to regale you with the story of the liquid junction potential:

Long ago, in a galaxy far far away, there was a Gedanken experiment...
Figure 1: Setup of the Gedanken. No, it ain't to scale, though aspartate is bigger than potassium. Not shown is the impermeable wall separating the two solutions. Hey, it's my Gedanken, thank you very much.

And in this Gedanken experiment there was a pipette filled with your typical pseudo-intracellular solution: You know the drill, high potassium (light blue), low calcium, and an anion species that's usually not chloride. This anion could be something like methanesulfonate, gluconate, or my own personal favorite, aspartate. The main thing to note is that all of these are bigger than chloride, and bigger than potassium. Thus, they have a lower mobility, meaning they don't diffuse as quickly as the accompanying cation.

Now, what happens when we stick this pipette into a bath solution that has your typical extracellular saline, made to mimic extracellular fluid (i.e., mostly sodium chloride)? Well, the chemical gradients favor the pipette constituents diffusing into the bath, and the bath constituents diffusing into the pipette. But remember, the aspartate is big, so it doesn't diffuse as quickly as any of the other ionic species. That slower diffusion of the anion leaves a net negative charge in the pipette. This charge separation across the junction between two solutions is THE LIQUID JUNCTION POTENTIAL!!!11!!!1!


Figure 2*: The Gedanken-imposed barrier is removed, and ions are diffusing down their electrochemical gradients. The bigger, slower aspartate can't keep up with the smaller, faster potassium, sodium, and chloride. It gets left behind in the pipette, generating an excess of negative charge.

Note that a liquid junction potential would also occur if the bath solution has cations and anions with significantly different mobilities. It just turns out that sodium and chloride have pretty similar mobilities, so that their contribution to the liquid junction potential is much smaller. But if you have N-methyl-d-glucamine (NMDG) as the main cation in your pipette solution, you'll have an excess of positive charge in the pipette solution, and a corresponding slightly positive junction potential.
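
For the quantitatively inclined, the standard way to turn this story into a number is the Henderson equation, which is basically mobility-weighted bookkeeping of who diffuses faster. Here's a minimal sketch in Python, restricted to monovalent ions, with relative mobilities that are approximate values I've assumed for illustration (for correcting real data, use something like Peter Barry's JPCalc rather than my back-of-the-envelope numbers):

```python
import math

RT_F = 25.7  # mV, RT/F at about 25 degrees C

# Relative mobilities (K+ = 1.0). Approximate, assumed values for illustration
# only. Monovalent ions only, to keep the algebra simple.
IONS = {
    "K":   (+1, 1.00),
    "Na":  (+1, 0.682),
    "Cl":  (-1, 1.039),
    "Asp": (-1, 0.30),   # the big, slow anion of the story above
}

def junction_potential(pipette, bath):
    """Henderson equation (monovalent ions): V(pipette) - V(bath), in mV."""
    num = den = s_pip = s_bath = 0.0
    for ion, (z, u) in IONS.items():
        cp = pipette.get(ion, 0.0)   # concentration in the pipette, mM
        cb = bath.get(ion, 0.0)      # concentration in the bath, mM
        num += z * u * (cb - cp)     # sign-weighted mobility term
        den += u * (cb - cp)
        s_pip += u * cp
        s_bath += u * cb
    return RT_F * (num / den) * math.log(s_bath / s_pip)

pipette = {"K": 140, "Asp": 140}     # K-aspartate internal
bath    = {"Na": 140, "Cl": 140}     # NaCl-based external

print(f"LJP, pipette relative to bath: {junction_potential(pipette, bath):.1f} mV")
```

With these made-up numbers it spits out roughly -18 mV, pipette negative relative to bath, which is just the slow-aspartate story above expressed in millivolts.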

Next up, how to measure the liquid junction potential.

---

*-Note that these figures were created using Inkscape, a very cool and usable open source vector graphics drawing program (a la Illustrator). Check it out, download it, play around with it!

Tuesday, March 31, 2009

Science sometimes makes me feel like...


Let's just hope I don't die some rock and roll death. You know, the "Nat didn't show up in the lab for the big experiment, we found him having choked on vomit, but we don't know whose vomit it was, cause you can't dust for vomit." It's either that or a bizarre gardening accident, and it is spring time.

(Seriously though, I love science, right beneath the family, but sometimes it's a heartbreaker. Goddamned unrequited love.)

Friday, March 27, 2009

Sometimes, the little things are what keep you going

When you're having a crazy busy morning, and experiments are going kinda crappy (I'm at the "let's remake the solutions" stage), it can be nice to get a Friday morning email alert to 3 fresh citations to your papers! Sure, it's a little thing, but at this stage I'll take what I can get.

People are out there reading, so there must be people out there who care. Keep 'em coming folks, keep 'em coming!

By the way, what's with variability between the citation search engines? I know this has been covered before somewhere, but really.

Searching on my first paper from grad school, we get:

90 cites in ISI from the J Neurosci site.
93 from ISI itself
98 from Scopus (I got a free preview for reviewing a paper. It's pretty cool, but I haven't used it enough to say much substantive. I do like the display of citations by year, Journal, author).
102 from Google Scholar

Now, some of these I'd cut slack for (e.g., J Neurosci probably only pulls cite numbers from ISI infrequently, and who knows how reliable Google Scholar is), but what's the difference between ISI and Scopus? Anyone look deeper into this?

If citations, h-indices and impact factors have traction as important metrics, shouldn't they be, oh I dunno, accurate?

Thursday, March 26, 2009

Electrophysiology isn't a technique you add to your CV; it's a state of being!

Neuropharma's comment on my last post contained something that stuck in the craw of this old electrophysiologist. Some grad student she knew thought he could waltz over and learn some electrophysiology right quick, and include it in his thesis.

Then reality struck this student, rapidly disabusing them of this conceit:
"He was shocked to discover that it would take him such a long time to learn the technique (he's starting from level 0) and said that it seemed so easy when reading it from some published paper!"

Every newb thinks that a technique they haven't mastered is easy, until they actually try it. And in fact, the bare bones mechanics of patching are pretty straightforward. I've taught a lot of novices how to patch, and by and large they can get to the point of gigaohm seals in a week or two (ok, we're talking transfected HEK cells here). Hell, I'm thinking any primate above lemurs could learn to get seals. (Not a bad idea actually; screw those automated patch systems, gimme an army of squirrel monkeys and an old warehouse, and I'll screen your chemical library right quick! It'd be like the nut shelling squirrels in Willy Wonka. And they'd literally be DrugMonkeys! LOLZ.)

But there's a huge distinction between the currents you're recording at that point, and 'good' currents. These first currents are crap: the leak is terrible, the series resistance is awful, the throughput stinks, the solution applications kill cells or generate huge noise, and you've got visible 60 Hz pickup. At best they're barely interpretable. But Electrophysiologicus will dole out a decent cell here, a nice recording there. That'll be just enough to keep you coming back for more, and to keep the image of good recordings in your mind's eye.

The transition between the crappy recordings of the apprentice and the regular good recordings of the master takes a long, long time, on the order of a year I'd say. These are the dark times, when progress is non-existent, perhaps to a greater extent than in the analogous stretch of the learning curve for other technical subspecialties. Most electrophysiologists I've talked with had this time in their training, typically falling in the 2nd year of graduate school.

And yet, there's very little useful advice the masters can give their apprentices during this time, other than "keep at it". Sure, there are suggestions to try this, or don't do that. In the end though, everyone just has to put in their time, slowly perfecting each requisite skill, and evolving their own personal technique. It sucks, for sure, but it does end.

It's just not gonna end before your rotation or your last few months before you finish your thesis.

Wednesday, March 25, 2009

A pictorial presentation of pipette pulling

In the interests of both responding to Dr. A's request for pipette pulling related pics, and appeasing the apparently still irked electrophysiology gods, I present to you a brief montage of the glorious task of patch pipette fabrication!

First, what the heck are we even doing? Well, we're gonna pull a glass needle, fill it with salt solution, stick it on a plastic holder with a wire inside, maneuver it to a cell, apply a little suction, and let the magic of "seal formation" occur. Next, assuming we're doing whole cell voltage clamp, we break the sealed patch of membrane with more suction, gaining control of the voltage across the cell's remaining membrane while recording the current (and also filling the cell with our pipette solution). Sheesh, when you distill it down to two sentences, it pretty much trivializes what I spent years learning and do every day.

The opening of the pipette tip will be ~1 µm, while the cell is on the order of ~10 µm. Obviously, if the pipette tip is too big, then we'll just suck up the entire cell. Not good. But as the pipette tip gets smaller, the resistance between the pipette interior and the cell interior gets larger. Also not good. In fact, that causes a whole host of problems that are left as an exercise for the reader to derive (ok, just kidding. There's a series of posts reserved for this, with current working title: "Dr. RseriesLove, or, How I learned to stop worrying and love the fact that my currents are all wrong").
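
For a quick taste of why that resistance matters (the full "Dr. RseriesLove" treatment will have to wait for its own posts), here's a back-of-the-envelope sketch in Python with assumed, purely illustrative values for the access resistance, cell capacitance, and current. The two classic gotchas are the steady-state voltage error, I times Rs, and the slowed membrane charging, tau = Rs times Cm:

```python
import math

# Assumed, illustrative values -- not measurements from any particular cell.
Rs = 10e6      # series (access) resistance, ohms (~10 MOhm)
Cm = 20e-12    # whole-cell membrane capacitance, farads (a smallish cell)
I  = 2e-9      # peak current you're trying to clamp, amps (2 nA)

v_error = I * Rs                       # steady-state voltage error across Rs
tau     = Rs * Cm                      # membrane charging time constant
f_3dB   = 1 / (2 * math.pi * tau)      # rough voltage-clamp bandwidth

print(f"Voltage error:    {v_error * 1e3:.0f} mV")
print(f"Charging tau:     {tau * 1e6:.0f} microseconds")
print(f"Clamp bandwidth:  ~{f_3dB:.0f} Hz")
```

With those numbers, a 2 nA current across 10 MOhm of access resistance means the membrane sits a full 20 mV away from where you think you're clamping it, and the clamp can't faithfully follow anything much faster than about a kilohertz. Hence the working title above.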

I start by cutting the capillary glass, scoring it with a diamond pencil and breaking it off to the correct length (so it'll fit in my particular setup, given the headstage position, etc.).

Then I smooth the ends of the capillary glass with a bunsen burner flame, because a jagged end (even as it comes from the factory) will scrape off the silver chloride on the wire that transmits the current from the ions in solution to the electrons in the amplifier circuitry (as well as tearing up the O-ring in the headstage holder).


Then we move onto the puller itself. There are many different kinds of pullers, but having been in a number of electrophysiology labs, I would say most people use pullers made by Sutter Instruments. The basic puller operation is to melt the glass capillary while pulling on either end, drawing the ends apart. To get a nice wide-tipped patch pipette, we use computer-controlled application of the heat, which lets you stop the heating a certain time after the capillary begins to pull apart. Over repeated heating/cooling cycles, you can make the perfect pipette.
Here's the puller, a P-97, and if you unscrewed the 5 screws on the front panel, you could peer in and see the brushless, super quiet 92 mm fan we installed (way in the back of course, a real pain in the arse to reach). The smoked plexiglass cover opens to reveal:


The business end of the puller. The circular thumbscrews clamp down on the ends of the capillary and maintain tension. The capillary feeds through the box filament, which gets hot when the puller is activated (sorry for the flash glare here). When the glass separates, we're left with a pair of pipettes, which we fire polish by bringing them close to a red-hot wire (observing under the microscope). Finally, we're ready to patch!




The pipette is filled with intracellular solution, stuck in the polycarbonate holder (which has the silver wire in it), and stuck into the headstage of the amplifier. The suction tube allows you to apply positive pressure while you're approaching the cell, or negative pressure to form the seal and to break through. The cells are sitting in an extracellular-like solution in the chamber, and the pipette approaches under micromanipulator control (here, a piezoelectric-based Sutter MP-285), all the while watching through the microscope. In fact, the pipette in this picture is making a GOhm seal on a little HEK cell. Of course, when I applied suction to break through, this cell was terribly leaky (again, Electrophysiologicus, patron of patchers, I'm like so over that hubris; could we maybe move on now?).

If you look closely, the tip of the pipette is wrapped with a thin strip of Parafilm. This helps reduce the capacitance of the pipette, but isn't nearly as time consuming or messy as using Sylgard. It's also a requirement for setting the series resistance compensation. All of which are good topics for future posts!

Hope this was at least mildly useful to some people out there, and marginally enjoyable to others. If anything's not clear, just fire up the comments and lemme know.

Monday, March 23, 2009

We got monkeys running all over the house!

Ok, well not so much running, though those little legs are always moving! There's plenty of room to grow into it, as the young'un is still pretty mini. But she's a happy one, especially while modeling DrugMonkey schwag! So what are you waiting for? Go get yours here. Do it, do it now!

In other news, I turned 35 yesterday. It was a great day (thanks to the wife and family!), but frankly it's a crappy age. First time I actually feel old on my birthday. *sigh*

Thursday, March 19, 2009

2 grand? How about 20 bucks?

Ha!

I've been having a recent bit of difficulty with our pipette pullers in the lab. Well, pulling program parameter searching is no news to any of the l33t electrophysiologists who frequent this blog. We've all been there, and we'll all surely revisit that terrible state of being.

But not too long ago one of the pullers started making a horrible noise when switched on, as the cooling fan must have lost a ball bearing. The company service folks were in the area for the recent Biophysical meeting (how was it, Dr. Samways?), and made a swing through the lab. They offered to refurbish the whole thing and replace the fan for $2000. Not a pressing issue, but apparently as heat builds up inside the puller, it slowly dims the display, making it hard to edit programs.

Economic times being what they are, we demurred. Which is a good thing, because when we pulled the front panel off that sucker, the fan was just a 92 mm fan like you'd have in your computer. So we got one of those, pulled out the old dead one, and slipped in the new. Voila! And while we were at it, we got rid of the decade plus layer of dust in that thing.

Still though, I miss the old P-84 I used in my thesis lab.

Tuesday, March 10, 2009

Hubris - and that ain't a circumcision in Whoville

Alas, I fear I have angered the Gods of Electrophysiology with my last post.

For now in their puckish ways, they have sentenced me to several hours of fiddling with the puller, in a vain Sisyphean search for something resembling a stable program. Or really anything that might give a usable pipette.

I've been burned bad. So what does one do when burned like Icarus in the tale of old?

WHY, IT'S TIME FOR MAIDEN!




(I'm digging the bass in this one!)

And it's back to the puller...

It's a beautiful....pair of sodium currents!

I was recently looking back over one of my papers from my thesis work, and came across a recording that I frankly find just beautiful. I had it in my mind to post it, and then Dr. J's post about their recent beautiful gel spurred me on to finally do it (another of the benefits of non-pseudonymity):

Figure 1: Sodium currents during action potential waveforms in nociceptive sensory neurons.

Look at that goddamn subtraction! And that outward transient current in TTX-R? Gating. Current. Sure it might be from a 6 year old paper, but it still thrills me a little. Some of this nostalgia might be related to the comparative lack of beautiful TRP channel currents I encounter these days. Oh, for a TTX of TRP channels. Oh, for closing your channels with negative potentials. *sigh*

As I commented over at Dr. J's place, beautiful data don't necessarily equal meaningful data (same with the converse), but I do think there's likely some correlation between the two. 



Tuesday, February 17, 2009

Me, cool? Not hardly!

Cool is no adjective to apply to me. But thanks for making me feel better.


NerdTests.com says I'm a Cool Nerd.

Thursday, February 12, 2009

The thrill of victory...

On the off chance that any of my minuscule readership does not already follow Ambivalent Academic's blog, you have to go over and check out yesterday's post over there. In it AA discusses the positive aspects of being a basic academic scientist, and I'd hazard a guess that many of us can strongly relate to those feelings. It serves as a welcome counterpoint to all the crappy and ugly aspects of the practice of science, and all the bitching and complaining that goes on in the blogosphere (and in lunchrooms, coffee breaks, and post-seminar chats between scientists in real life). Now, don't get me wrong, discussing all those negative parts is very useful and necessary. But it's also nice to sit back and reflect on the things we love about science. I find it helps stoke my passion for science, which too often nears extinguishment. And I really do love being a scientist.

Physioprof added a great comment as well, about his excitement at an old breakthrough achieved in grad school. You can feel his enthusiasm, even for an event that must be years old by now. And as he says, we're all chasing that feeling. I remember one of my own as well, which I will share. For my first project in grad school I was recording sodium and calcium currents during action potentials in nociceptors. As there's no good blocker for TTX-resistant sodium currents, I settled on using ionic substitution, replacing external sodium with NMDG. That worked well for the subtraction, but I did notice that the resulting sodium current kept increasing even as the voltage approached ENa, which obviously shouldn't happen. I kept that stuck in the craw of my brain, which chewed over it as I proceeded to look at calcium currents in other cells. (Which is how I normally let it happen; given time the craw digests whatever problem is given to it).

I can still remember sitting at my desk, looking over some other experiments, when it hit me. It was obvious that intracellular Na+ and K+ were making outward currents through unblocked TTX-R channels, and these became sizeable at depolarized potentials near the peak of the AP. In retrospect it isn't so surprising. But it wasn't so obvious to me that the outward currents would really be large enough to make a big difference, relative to the large inward currents when external sodium was high. Turns out they were. Very soon after that I figured out a way to correct my previous results, which became a figure on its own in the final paper. All in all a great feeling.
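
For anyone who wants to see the bookkeeping behind that realization, here's a hedged little GHK-flux sketch in Python. The concentrations and permeability are made up for illustration, there's no gating, only the Na+ flux is modeled (the K+ sneaking out through the same channels adds to the outward current in the same way), and this is emphatically not the analysis from the paper. It just shows the core of the problem: subtracting the Na-free (NMDG) trace removes the outward flux carried by internal ions, so what's left is essentially the pure influx component, which never reverses at ENa the way the true net current does:

```python
import math

RT_F = 25.7  # mV, RT/F at about 25 degrees C

def ghk_current(v_mV, c_in, c_out, p=1.0):
    """GHK current (arbitrary units) for a monovalent cation; negative = inward."""
    x = v_mV / RT_F
    if abs(x) < 1e-9:
        x = 1e-9                     # dodge the 0/0 at exactly 0 mV
    return p * x * (c_in - c_out * math.exp(-x)) / (1 - math.exp(-x))

NA_IN, NA_OUT = 10.0, 140.0          # mM, made-up but plausible
E_NA = RT_F * math.log(NA_OUT / NA_IN)
print(f"ENa is about {E_NA:.0f} mV")

for v in (20, 40, 60, 80):
    control  = ghk_current(v, NA_IN, NA_OUT)   # net Na+ current with Na+ outside
    na_free  = ghk_current(v, NA_IN, 0.0)      # purely outward flux, NMDG outside
    apparent = control - na_free               # the "subtracted" Na+ current
    print(f"{v:>3} mV:  true net = {control:7.1f}   subtraction = {apparent:7.1f}")
```

With these toy numbers the true net current has already reversed by +80 mV (ENa sits near +68 mV here), while the subtracted trace is still resolutely inward: exactly the kind of "that shouldn't happen" behavior described above.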

I think we all just hope that our science doles out a sufficient number of these moments to keep us from totally giving up in the face of so many difficulties.

Friday, February 6, 2009

Perversity isn't always fun.

Judging scientist performance by impact factors is like judging CEO performance by short term share prices.

And both produce perverse incentives.

Let me add as a final note, if you scan people's CVs for Cell, Nature, Science papers, then you're judging by impact factor.

Thursday, February 5, 2009

First Axiom

I've been doing lots of thinking about teh Scienz lately, from the little details to the big pictures. (Yeah, ok, so maybe I'm having a scientific midlife crisis. What's it to ya?)

Sure, most of these thoughts are neither particularly well considered nor highly developed. Indeed, they're probably not very insightful either. But the best way of improving upon them is to just start writing about them, and hopefully get feedback from other people (especially fellow working scientists). One benefit of this approach is that I'm not so wedded to my thoughts that there's a huge cognitive barrier to overcome if they need altering.

With this entire morass in my head, regardless of the current state of each, I've begun by thinking explicitly about the "why?" question*. Why do we do what we're doing? What's the ultimate purpose of it? In my mind, before we can judge whether a particular way of doing things is good or bad, we need to figure out the answer to the "why" question.

So let's ask the question about Science, broadly speaking. What is the purpose of our endeavor? The answer to that question forms the ultimate basis by which its attendant ethics, practices, and structures must be judged.

My answer is that Science's aim is to produce true statements about the world.

I consider this to be the first axiom. I don't even claim this phraseology as my own, as I'm sure I must have read something to this effect over at Dr. Free-Ride's (which is the place I go when thinking about All Things Philosophical). But I can't think of a better way to put it. And I would guess that essentially all scientists would agree with it. If not, I'm all ears about what the ultimate goal of Science is.

Now onto whittling down that morass!**

* - I understand that this isn't terribly insightful. Either River Tam or Arlenna brought it up some time ago (i.e. - the hazy time that existed prior to the daughter's arrival) in a discussion about authorship order issues. Furthermore, just about every project planning book out there contains something similar. Hell, it's in David Allen's Getting Things Done book, so that means about 1.18 billion people on the Earth know it.

But knowing it and doing it are two very different things. If you don't believe me, consider just about any committee meeting you've had the pleasure of attending. How many actually started with "Why?" And how much talk was really about "How"? Besides, having done exactly this in project planning of various sorts, I'm often surprised by how useful it is in producing different ways of approaching and solving problems.

** - Something about this sentence really makes me happy.

Thursday, January 29, 2009

Ruminations of the temperature sort

Last night, while "shoveling" the slushy concoction from our driveway, I came to an unexpected realization: My cold weather tolerance has returned.

We've already had some bona fide cold weather this winter, with multiple sub-20 degree F days, and some nights in the single digits. And I really haven't felt terribly cold. This stands in contrast to much of the time I've spent here in Massachusetts, where I was always freezing. A breath of Canadian arctic air would send me running for the long underwear.

And yet, I wasn't always like this. I grew up in Connecticut, which, though not particularly cold, is far from tropical. Then I went to college in Chicago. Ah, I remember the day, trudging to a morning class, when the air temp was -25 deg F (-50 with wind chill). Overall, I remember the cold as being present, but no big deal.

Then, I started grad school at Stanford. Palm trees! Orange and lemon trees! Chelsea Clinton! I wore shorts EVERY day that first winter. It was the El Nino year, and it rained every fricking day, but I still wore shorts. I distinctly remember the odd looks I received from those Palo Altans. They'd be wearing their hats and gloves, and I'd be traipsing across campus, bare legs and all.

But then a funny thing happened the next winter in California. I froze my ass off. I had to wear pants all winter, and even turned up the electric baseboard heat in my on-campus apartment. WHAT? Apparently, a year of living West Coasterly obliterated any and all cold tolerance I had built up. A loss that took about 10 years to reverse.

Which brings me to what prompted me to even post this bit of boring biographical information: Is this a real phenomenon, and if so, what's the neurobiological basis for it? Do people in different geographical locales actually perceive temperature differently? Is the difference peripheral (i.e. sensory neurons) or later in the processing? For example, what would you see if you compared the sensory neuron activity of lifelong residents of Alaska and Florida? And how does that change?...trp channels?? trp channels?? trp channels??...And is there a sex difference in this aspect (how many of us can recount the differences between our own sense of temperature and our partners'?). And how does aging impact this?

The final question is: when will Spring arrive? 'Cause though I don't mind the cold so much, I'm already sick of the snow.