This 21st century scientist’s life & learning.

Building on the platform. 

I’ve spent some time thinking about what I’ve built over the last few years as I’ve made my way from someone who wanted to just leave the world to someone who wants to contribute in real, positive (don’t we all?), and meaningful ways.

Coming out of the dark and into a world of wonder can be complicated. Going from feeling flat and divorced from the world to being vital and more engaged can be a scary process. I realize just how much I’ve missed out on by not going deep into any particular subject, because I didn’t feel much in whatever I engaged in. I’ve written before about just what depression takes away from learning, and it’s hard to describe, since plenty of successful people have depression (perhaps they succeed despite it), and I can still read and write (perhaps not well, but it is something I work on) and do basic math. I feel I can learn things. But I have tended to lack the emotional connection to a subject that can boost learning. Depression also feeds a fixed mindset rather than a growth mindset, with constant rumination and the voice that says ‘who do you think you are? You’re nothing, no one, and don’t matter’.

Eiffel Tower under construction 1888-1889. Source: Yale Libraries.

This blog has really documented that process for me. I hope I’ve been building a platform on which to build even better and greater things. Beth Buelow, an entrepreneur, coach, and introvert, talks in her really good book about an image series she found of the Eiffel Tower being constructed. They built the base quickly, and then progress appeared to stop for a long while before the tower was completed. During that apparently fallow time, the construction workers were doing a lot of reinforcement of the structure, adding rivets and doing the preparatory work to build the tower upward. They were building a strong base to create what was then one of the tallest structures in the world, one that persists to this day.

I hope I’ve been building that kind of base. That I’ve gotten better in some key ways to start the next phase, to really get out into the world for people to come and see. I do need reminders of how habit change can be most effective, like this one from James Clear. And it helps to be reminded to surround yourself with people who help you be your best. Though I find myself overdosing on ‘lifehacking’ lately (it can be great for ideas, but it’s easy to overdo or to be constantly trying new things). I’ve built up a system that kind of works, I think, that’s healthy for me. And now I need to mold it into output that helps me grow more and gets me out into the world, being mindfully productive.

And as James Clear points out, prioritizing matters. Taken further, and perhaps scarier/harder, is the idea of finding the distinction between should and must, and choosing the latter. And continuing to learn, grow, and retain new knowledge/experience through a system that works and is evolving. That also means being able to make decisions more rapidly than I do now, to act on them, and to be guided by what is truly important to me.

What is essential? 

I’m going to write down an ambition of mine: I want to be a science writer in some way, shape, or form. I love transmitting knowledge between minds. It seems to drive a lot of the decisions I make. It’s something that is more important to me than the research I do now. It’s an ambition that’s scary, but also seems deep-seated. I love science. I love writing, art, and popular culture. I love learning and teaching/communicating. Maybe it’s because I’ve listened to one too many podcasts and read one too many amazing pieces of writing about science that I’ve gone out of my mind, but why do I gravitate towards those things in the first place? And how do I get from where I am now to a new place? That’s not easy to answer.

Being a scientist now means having to wear a lot of hats, being seen as competent and amazing at many things that Ben Lillie (partially) listed, including having a public face to engage with non-scientists. It seems like people are expected to do more and more every year, to sacrifice our lives for our work, to produce ever more value. And whatever we do has to be quantified and standardized, even if that’s not the best or is too narrow a measure.

With the digital tools most of us have access to, we are expected to do everything ourselves, to produce more, always learn things flawlessly, and basically be perfect. And yet, that is unrealistic for any individual human. Not all of us are skilled at everything, but the 21st century world seems to demand that in an era of impatient teaching and exclusion if you’re not in the ‘in’ crowd from early on. And there is infinitely more to learn. And of course, digital tools allow for tracking of productivity more than ever.

Many circumstances can keep us from trying things that we’re truly suited to do. There’s a story attributed to Mark Twain (I can’t find a source) about a man seeking the world’s greatest general, only to die, go to heaven, and find that a cobbler would have been the greatest if given the opportunity. Did the cobbler just live at a time with no war, or was there a crucial moment where he didn’t take a leap into military life? If it’s the latter, hopefully there’s still time for me to make a leap. Maybe by not having an alternative, it’s possible.

Coding is something I am just starting to dabble in…and we’re all told it is the essential skill of the 21st century. I don’t know if that’s the case, but it certainly seems handy for any citizen of the Internet, where many of us spend our time. And even without a full understanding, at least knowing some of the theory behind the gorgeous websites we see each day is important. It’s also important to know that the people who build them are not perfect either, and often have biases/problems. And I don’t think this idea applies just to coding. To be in demand seems to mean being good at all the things and not needing a learning curve. Of course, that might be my warped perfectionist perception speaking.

A lot of science news is dedicated to reporting how we might all live better, parent better, be healthier, do more for the environment, and basically be better people if only we’d all behave, spend money, or act differently. Only that is vastly unrealistic. And the recommendations are often wrong because of flawed science. Science really is the last word on nothing.

What can we get wrong?

Phil Plait, in a post on his Slate blog, wrote about the response to a picture he tweeted about actresses who have a passion for science (great!). The problem comes with Mayim Bialik (who has a Ph.D. in neuroscience) and her anti-vaccination views, which are scientifically indefensible, as this NPR story on a documentary about the effects of not eradicating polio demonstrates. Keith Kloor addresses this with Dr. Oz. Similar, and perhaps not as dangerous, are Bill Nye’s anti-GMO views; if only because Nye, an engineer, does not have as informed views about biology and doesn’t seem to be strongly anti-GMO as yet, just highly skeptical. He could change his mind yet. Bialik and Dr. Oz should know better, being more familiar with life sciences and medicine.

The process of robust science dictates that ideas and technologies supported by science (e.g. climate science, gravity, evolution, smartphones, vaccines, current GMOs) are in fact safe and work, and that is the final word (of course, each product needs to be taken on a case-by-case basis). Selective application is not acceptable. There are areas of science that are still debated, and the above ideas continue to be investigated and tested: to test new methods of delivery, to explain parts of these ideas we don’t have answers to yet, or to improve them in some way (or to create vaccines for viruses we don’t have vaccines for as yet). And of course, scientists are never absolutely certain; we’re taught to critically examine our ideas and to design experiments/seek data that challenge them (that may happen less in an era of hyper-competition and tight funding).


In today’s world, it really appears unacceptable, especially for a public figure/celebrity, to say ‘I don’t know’ when pressed about some question that’s out there in the world (uncertainty being a perceived sign of weakness?! I would argue that it’s the opposite). I am not a psychologist, social scientist, or neuroscientist, only a sufferer of depression and anxiety who has learned what I can about them and writes about my own solutions (some scientifically grounded, others likely less so). I’ve tried to strike a voice not of rainbow-barfing magical positivity, but of grounded optimism. I routinely say that I do not know, and feel uncertain about most things, and this can be paralyzing. Who would do anything given the potential repercussions of getting something wrong? Phil Plait seems to have changed his mind after hearing from fellow bloggers about Bialik’s anti-vax views. I don’t even know where her anti-vax views stem from (is it a case like Dr. Oz, where his spouse seems to have opened the door to pseudoscience views?).

Some of these views may be caused by hastiness and shorthand/lack of time to think. In an era where we’re awash in information, it is impossible to be informed about everything, and yet we’re also too quick to be aghast when people don’t have views or don’t know something. At best, that reaction comes off as enthusiasm you want to impart to someone about a topic. At worst, it’s used as an identity marker to exclude people, even if they’re new enthusiasts for something you’ve been into for years…people get turned away because of their newness to something and simply not knowing as much. While I agree enthusiasm only takes you so far, it’s a spark that can carry you into new and unexpected places, and it shouldn’t be discouraged by whoever has deemed themselves a gatekeeper of a community.

There is demand to specialize and yet be a generalist at the same time. And to be instantly able to learn and absorb new things. I’m willing to work hard to figure things out, but if I’m given insufficient time to learn what I need to, I’m much more likely to make a mistake (and learning time seems shorter and shorter…and unexamined learning can lead to problems). We’re all encouraged to learn how to learn, and yet that seems hugely insufficient somehow. I am nearly paranoid about missing something critical or leaving some citation out. Of course, it’s not all about what we’re informed about. It’s also true that we develop identities around shared beliefs (‘people like me have this belief, I must think that too’) that can become quite entrenched in communities, in which case information alone cannot change someone’s mind, as work by Brendan Nyhan and others has shown.

Hard at work reflecting.

It may be that I’m just worried about something I feel exists but isn’t actually as bad as it seems. However, everywhere I look, there are demands to be up on the latest everything and if not, you’re falling behind the times! Keep up or go away, you can’t compete and so shouldn’t even try. The world is complex and crazy and there is likely more awareness of that than ever. Being humble in the face of that is a virtue in my book. There is likely always more to a story. And just because we’re not always completely informed does not mean we can’t act or put our voices to an idea, but we need to listen to feedback and accept evidence contrary to what we think is going on. All of these mental gymnastics should underscore just how hard it is for scientists to come to strong theories about how the world works and when a scientific consensus is reached, it’s a big deal, and more credible than an individual report alone.


Good coffee takes time.

I am an academic scientist right now, trying to contribute to my field in a meaningful way and not add to the noise of wrong/hasty information that’s out in the world. Patience isn’t a virtue we hear a lot about anymore. The world seems to be more about speed and getting to something first. Instant may be good for some things, but I like to think of it like sources of coffee. I’ve never had a good cup of instant coffee. I’m not sure that exists. Putting in the work of grinding beans, putting them through a quality filter, and taking the time to let it steep often makes for a better cup (not always). Perhaps that’s due to my (highly) introverted side that likes reflection, writing, and learning before speaking up. I hope any job I do hold will allow me to do just that, within reason, of course. I am determined to add value wherever I work, and I hope that the skills I gravitate towards/have developed are valued somewhere in the world.







GMO labeling.

I had a creative thought about GM labeling that might also teach people something about the evolutionary past of plants and agriculture (both of which involved extensive genetic modification of plants before what is now considered GM came into being in the 1980s):

Figure 1. Complete GM history of crop plants. (A) A made-up phylogenetic tree of plants, focusing on Zea mays. Small arrows indicate notes/interesting evolutionary features/sarcastic notes. (B) A basic lineage of modern hexaploid wheat. Purple circles indicate modern species, even if a GM trait is present. ya = years ago.

I believe something like the cartoons in Figure 1 (panel A has made-up branches on the dendrogram, but the relationships are basically right; panel B is a very basic diagram of the species’ genomes that hybridized to form hexaploid wheat) could tell the full evolutionary history, including any genetic modifications in modern times. If a GM label exists, it has to be more than ‘GMO’…that’s not the important part in a lot of ways; the specific modification is (as in, what trait has been conferred on or taken away from the plant through that genetic modification). Heck, maybe with this ‘full’ label, even Monsanto would be in favor of labeling their products, since it provides the full picture of plant modification (both the natural and the somewhat unnatural).
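To make the ‘full history’ label idea concrete, here’s a toy sketch in Python (not any real labeling standard, just an illustration): each modification event in a crop’s lineage, natural hybridization or modern transgene alike, becomes one line on the label. The genome letters follow the standard wheat story (AA x BB giving tetraploid emmer, then AABB x DD giving hexaploid bread wheat); the dates are rough illustrative figures, and the final transgene event is entirely hypothetical.

```python
# Toy 'full modification history' label for bread wheat.
# Natural hybridizations, human selection, and modern transgenes all
# become entries in the same list; dates are rough, and the transgene
# event ("trait X") is a made-up example, not a real product.

events = [
    {"kind": "hybridization",
     "parents": ["Triticum urartu (AA)", "Aegilops speltoides-like (BB)"],
     "result": "wild emmer (AABB)", "when": "~500,000 years ago"},
    {"kind": "hybridization",
     "parents": ["emmer (AABB)", "Aegilops tauschii (DD)"],
     "result": "bread wheat (AABBDD)", "when": "~10,000 years ago"},
    {"kind": "selection",
     "parents": ["bread wheat (AABBDD)"],
     "result": "semi-dwarf varieties", "when": "20th century"},
    {"kind": "transgene",  # hypothetical modern GM event
     "parents": ["semi-dwarf bread wheat"],
     "result": "hypothetical trait X added", "when": "present day"},
]

def render_label(events):
    """Render the modification history as one label, oldest event first."""
    lines = [f"{e['when']}: {e['kind']} of {' x '.join(e['parents'])} -> {e['result']}"
             for e in events]
    return "\n".join(lines)

print(render_label(events))
```

The point of the sketch is that the modern transgene is just one more line in a long list of modifications, which is exactly what panel B of Figure 1 tries to show visually.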

Some thoughts on GM & companies behind them (& why they’re so loathed by some):

—Yes, Monsanto is a large for-profit biotech company that is part of the industrial agriculture complex that basically owns the US Government and can get whatever policy they want passed (fact is, it still costs $10 million to go through the regulatory hurdles to bring one of their GM crops to market– which is why all GM crops come from large companies…they’re the only ones who can afford to go through the regulatory approval process).

—Monsanto sells a product. It is genuinely useful to some farmers in some places. In other places, less so. Mileage will vary. Now, Monsanto’s marketers would (probably) tell you that GM is a panacea, but it’s not. GM crops are great for some situations, but right now at least, certainly not ALL situations. But I’m sure scientists are hard at work coming up with new GM products that might genuinely help even more farmers grow higher-yielding food on their land.

—A lot of the hate Monsanto gets is exactly why anyone hates a large corporation (or the government)…it’s a big entity that seems to get away with things the rest of us who are *ACTUALLY* people can’t get away with (tax shelters, anyone?).

—The optics of a large company these days are inherently terrible: they own Washington and get all the legislation they want (just see the FCC and Net Neutrality debate) and will do anything to enrich executives at the expense of other employees (and favoring shareholders above all). It has little to do with what the company actually does. However, the bad optics don’t matter unless they affect the profits of said company (how many bank CEOs got canned after the financial crisis?). And of course, I’m sure this is a bit of a cartoonish picture of how things actually operate, but there is some validity to it, and it’s kind of sad that people like me have become so jaded by ‘The Man’, for lack of a better term.

—Boycotts and actual science can work to change a company; Monsanto used to make Agent Orange…they don’t anymore, as far as I know, in part due to a changing ag industry and in no small part due to the shift in the culture at large.

—Monsanto hasn’t done itself many favors by being so opaque about its technologies and what its scientists are motivated by; they’re not out to destroy the world. I genuinely think they want to leave it a better place; they have kids and families too, after all.

—Monsanto is keenly aware that if they put out a GM product that is actually dangerous or detrimental, they’re done. Finished as a company, no one will trust them ever again. So they really do extensive testing of their products before releasing them. And they’ve got a great safety record (the environmental degradation is not necessarily due to GM, but simply to agriculture itself…a very disruptive process to the environment…something that I hope they’ll work to improve, after all, no environment & we’re all kinda screwed…).

Labeling issues.

Vermont recently became the first state to mandate GM food labels. There are other proposals and this was recently in ‘The Atlantic’. It’s popular and part of why it’s popular is the simple ‘right of the consumer to know’ what they’re eating. Other states have come close to passing similar legislation. I haven’t followed these debates closely, but here are some of the issues to consider in labeling GM foods:

—What would the genetic engineering label entail? Not all GM crops are the same; scientists can put basically any gene into a plant (we do this for strictly research purposes all the time), so is labeling each modification important? (yes, it is– depending on the gene, a lot of different traits can be conferred to a plant, that is more important than the fact that it’s been genetically modified).

—What ‘counts’ as GM? Agriculture has been practiced for 10,000 years. The crops we grow now are selected varieties for traits that are good for growing a lot of food in a small area…like dwarf wheat, responsible for the Green Revolution; there are some very real genetic modifications that happened in that time…even whole genomes introduced into plants (that’s about 20,000 genes…not just the usual 1 or 2 of today’s GM). And then, of course, evolution shapes plant genomes too; is that genetic modification?

—Nature is messy. So-called horizontal gene transfer (a gene passing between organisms that aren’t related; in other words, not from parent to direct descendant) happens all the time. Viruses insert DNA into the genomes of all kinds of life every day. Bacteria swap genes constantly, sometimes into eukaryotes like humans and plants, and there are even examples of eukaryote-to-eukaryote horizontal gene transfer: the neochrome gene in many ferns originated in an early land plant called a hornwort…and yet anti-GM activists aren’t torching the fern-covered forest floors. Natural does not mean good, it means natural. Human-made does not mean bad (sometimes that’s the case, but not always), it means human-made.

Wrap up.

So does a GM label have to incorporate both modern and traditional GM? Should it include all of the genetic, natural, and evolutionary history of a plant we eat? Consumers have a right to know where their food comes from, after all. My Ph.D. advisor once found an op-ed by a guy saying that if he ate a plant engineered with a human gene, it would be cannibalism. That’s a little ridiculous…after all, humans share quite a bit of DNA with plants (so that salad you had for lunch…you may be 50% cannibal), and at the molecular level, we’re all pretty similar, made of all the same stuff. We’re all breathing in and incorporating atoms of people and diseases long gone and forgotten by history…does that make us all cannibals?

It is difficult to trust a large company (I’m uncomfortable having Google know everything about me, at some level), but Monsanto does, ultimately, want to help farmers, and they recently acquired a company that models climate/climate change to help farmers grow food in what will be an increasingly changing climate. That’s not the move of a company that doesn’t think about the future and leaving the Earth a better place (of course, they’ll make money doing it, but profit isn’t inherently evil…it just can be). I know a few scientists who work at Monsanto, and I would consider working for them myself (not that they have reason to hire me, especially after I’ve been a bit harsh on them here…after all, I think too much and ask too many questions). I am not an expert in everything Monsanto does (in fact, a lot of it is kept under wraps as trade secrets…which also probably doesn’t help them), nor in farming. I am a plant scientist studying plant development.







I recently re-listened to the ‘On Being’ interview with Jennifer Michael Hecht, author of the book ‘Stay’, which I read about on Brain Pickings. It’s all about making a non-religious argument against suicide. And there really are reasons to stay. Be assured, your absence will be noticed. I won’t go into all the arguments why here, but it’s true.

In the latest episode of ‘Cosmos: ASO’ last night, Neil deGrasse Tyson walks through the fact that we’re the legacy of all those organisms that struggled for survival on Earth before us. That’s one reason to stay. There are many, many others.

Last week, I casually wrote a Gchat away message talking about an important experiment I had to set up the following day. And it led to this idea for a reason to stay:

Blog Post Line.


It’s something I’ve told myself the last week or so and it’s good to remind myself that life, in part, is about keeping on trying. I am doing things now that I couldn’t have possibly done a few years ago and it’s because I stayed; there was a time I didn’t want to.

The future isn’t really written in stone; as much as scientists try to do predictive work, it only applies to rather narrowly defined experiments, nothing like life. So it’s not only about saying ‘Stay’, but also about cribbing one idea from science: try new things and find those that work; discard those that don’t; and keep creating, tinkering, interacting, acting, thinking, insert favorite present participle here. We’re only here once.

There is a problem within academia surrounding the poor mental health of too many people in it, particularly among young Ph.D. career-path people. The reasons vary, but the added pressure of the high career uncertainty for Ph.D.s and postdocs now is surely a contributing factor.

Tomorrow there is an important experiment to do. Find something new that might work for you, and even if things don’t work out, you’ve at least fought in the arena.







I was reminded recently of a 19th-century invention that actually shocked people when it was installed in Harrods in London. It was an escalator. And apparently they gave out shots (drams?) of brandy at the top to calm people’s nerves after riding it (do you in fact ride an escalator?). I think nothing of escalators. I use physical stairs whenever possible, but if I have no choice, escalator it is. It’s just a thing that exists. I can’t imagine being shocked by it, or surprised in the least. Maybe the first time I rode one as a kid? They are fun…I always wanted to slide down the railing.

At one of the first-ever screenings of a motion picture in a theater, showing a train arriving at a station, the story goes that the audience leapt out of their seats in fright as the train in the scene moved towards the camera, and potentially, therefore, off the screen. I can’t imagine having a reaction like that to a movie, something that is obviously a projection on a screen.

That’s not to say I haven’t flinched at a movie when something is scary or shocking, or that movies aren’t powerful and can make you react in a very real way. But my sense is that I always know it’s just a movie; nothing will leap out of the screen (except emotions and feelings, which can certainly cause a physical reaction, but that’s not most movies, or even most things we encounter these days in a civilization based on science and technology).

I wonder what the modern equivalent of the train pulling into a station or the escalator is, or of nearly anything electric/mechanical that came into being in a rapidly industrializing age. Certainly when Apple introduced the iPhone, everyone’s jaws dropped; it was just a huge leap from what existed before (I know, the BlackBerry had been around before, but this really opened things up to the average consumer). And yet it was still familiar somehow; it was still a phone, after all (though we seem to use them less as phones as they get more powerful and connected).

I am thinking about all of this because it is hard to wrap my head around the fact that there are still crazy things out there in the world to find, to invent. Could I invent one of them? Do I look for problems to solve in that way?

Discoveries have become routine and a bit blasé. Of course there was a breakthrough, those happen every week (even if some don’t turn out to be in the end because of the hype-machine that exists today).

What could I do that would wow the world? Am I thinking too audaciously? I am a scientist, after all. I feel like I should be inventing new things, or realizing the eventual potential of my current work, but I’m not just now. Maybe I need to be more engaged with it. To think outside the boxes I’ve put myself in over the last several years.

There is natural resistance to some new technologies, and certainly rules need to be in place for responsible use of anything new. These ought to be based on sound science. And of course that requires a level of scientific literacy: to come up with sensible rules, and for citizens to be able to grasp enough to be OK with a technology and to adopt it. I think the latter is not nearly sufficient in the US these days, with large numbers of people not very engaged with the scientific process, and also denying climate change or being anti-vaccine or anti-GMO. An individual vaccine might be bad (unlikely, after all the development they go through) and an individual GM modification might be bad (again, there’s heavy regulation here too), but they need to be taken on a case-by-case basis.

I’m pretty amazed by modern communication technology, even if it can be frustrating. Computers, the Internet, smart phones- all pretty awesome things.

What’s the thing that would truly blow away a modern audience? I’m not sure. But I think it exists. Or will. I’m far less certain that I’ll be a part of it, but I’d still like to try I think.

Ever on and on.


Based on a Twitter discussion with Steve Hamblin (@BehavEcology), I am a little too excited about comparing postdocs and grad students to an ant colony. Biochem Belle (@BiochemBelle) posted a few months ago about how tired the indentured-servant analogy is, and how inaccurate (I’m sure others have said similar things).

However, the ant idea has legs…six of them. 

  1. Ant colonies are collectively smart; so are Ph.D.s and postdocs (and the scientific enterprise generally).
  2. In ant colonies, individuals are expendable; so are Ph.D.s and postdocs.
  3. Ant colonies are very persistent and collaborative, following other ants to food, etc.…sounds like Ph.D.s and postdocs.
  4. Ant colonies are adaptable. Ph.D.s and postdocs also adapt to solve problems in their environments.
  5. Ants are very strong. Ph.D.s and postdocs have strong brains, at least.
  6. Ants can have fungi infect their brains. Ph.D.s and postdocs seem very prone to depression, anxiety, impostor syndrome, and other brain issues (I’ll raise my hand and say I have had to deal with these things, and I don’t think I’m alone).
I like this analogy…there’s at least dignity in it (ants are cool!). It acknowledges the difficulties faced by individuals and can be put into an ecosystem context: currently there are too many ants for the resources available. I have no idea if this actually ever happens in the real world, but it is certainly the case with Ph.D.s and postdocs. Hopefully it’s not so dire that only the lucky few make it to ‘old age’, as it might be with an ant colony.

I also realize that the workers in ant colonies are all female…which is obviously not the case among the scientific population (especially at the faculty level), but that may be taking the analogy to its breaking point.

I might well be crazy here, but it was fun to think about this. 

As an addendum, I’ve basically disabled comments here- I’d prefer to interact via Twitter. Such a great medium for discussion.