
Channel Description:

A feed of all posts from all FOS blogs.


    A favorite small shrub in our glasshouse is this week's Friday fabulous flower, commonly called "miniature holly" (Malpighia coccigera) because of its glossy, spiny-margined leaves. Like many members of this family (Malpighiaceae), the flowers have spoon-shaped petals, here ruffled and fringed to boot, and while not large they are produced in great numbers, making for a very attractive display. This species, native to the West Indies, is used as a tropical ornamental. The fruit is a red berry and edible, as are those of several larger species whose fruits are better known as Barbados cherries or acerolas. The flowers have a funny fragrance, not altogether pleasant. In between the upper three petals, glandular sepals can be seen that secrete an oily substance as a reward for the bee pollinators. The genus is an honorific for Marcello Malpighi (1628-1694), professor and naturalist at Bologna. He was an early anatomist who made contributions to both botany and zoology (Malpighian tubules).


    OK, I finally have conditions where GFAJ-1 cells grow reproducibly in medium containing 40 mM sodium arsenate: tightly sealed screw-cap glass tubes or bottles, half-full, gently rocking at 37 °C.  

    It's sadly true that I lack any insight into why the cells wouldn't grow in polypropylene screw-cap tubes, or in flasks, or why sometimes they wouldn't grow in anything.  Since Wolfe-Simon et al. grew their GFAJ-1 in screw-capped glass tubes, I think I'm adequately replicating their growth conditions.

    So now I've grown big batches of cells in bottles and extracted DNA from them.  My collaborators tell me they'd like to have about 50 µg of DNA for their cesium chloride gradient purification and mass spectrometry analysis.  This requires starting with about 2 x 10^10 cells, given an estimated genome size of about 3.7 megabase pairs, and allowing for some losses in purification.  For cells growing in phosphate-rich medium I figured 50 ml of culture would be enough (one with arsenate and one without), and for phosphate-limited cells I tried 500 ml.  Harvesting the cells turned out to be a bit tricky, because when I centrifuged them they formed a very loose pellet - I had to take the pellet with about 10% of the supernatant and centrifuge again. 
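    The cell-number arithmetic can be sketched as follows (a quick back-of-the-envelope check; the 650 g/mol average mass per base pair and the ~50% recovery figure are my assumptions, not figures from the post):

```python
# Rough estimate of cells needed to yield 50 µg of genomic DNA.
# Assumptions (not from the post): ~650 g/mol average mass per base pair
# of dsDNA, and ~50% recovery through the purification steps.
AVOGADRO = 6.022e23          # molecules per mole
BP_MASS = 650.0              # g/mol per base pair (average, dsDNA)

genome_bp = 3.7e6            # estimated genome size, base pairs
target_ug = 50.0             # DNA requested by the collaborators, µg

genome_mass_g = genome_bp * BP_MASS / AVOGADRO   # mass of one genome copy
cells_no_loss = (target_ug * 1e-6) / genome_mass_g
cells_with_loss = cells_no_loss / 0.5            # allow ~50% losses

print(f"one genome ≈ {genome_mass_g * 1e15:.1f} fg")       # ≈ 4.0 fg
print(f"cells needed (no losses): {cells_no_loss:.2e}")     # ≈ 1.25e10
print(f"cells needed (~50% recovery): {cells_with_loss:.2e}")
```

    which lands in the same ballpark as the ~2 × 10^10 cells quoted above.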

    I did only a crude DNA prep, by my standards, but the DNA is much cleaner than one sample in Fig. 2 of the Wolfe-Simon et al. paper.  I lysed the cells with lysozyme and then 1% SDS, extracted them once with phenol and once with phenol:chloroform, and added NaCl to 150 mM and 2 volumes of 95% ethanol, all as Wolfe-Simon et al. did.  But instead of centrifuging the now-insoluble DNA and RNA, I spooled the DNA fibers out onto the tip of a sealed glass pipette, rinsed them with 70% ethanol, and air-dried them.  (I also added RNase A with the SDS to degrade the RNA.)  Spooling can only be done with chromosomal DNA, because it requires long fragments at high concentration, but it's the method of choice because it leaves behind all the non-DNA insoluble material that centrifugation would pellet with the DNA.

    I then resuspended the DNA by swirling the fibers in Tris-EDTA and gave the clumps time and pipetting to help the fibers disperse.  I checked the concentrations using the wonderful NanoDrop spectrophotometer, and ran about 200 ng of each prep in an agarose gel to check its quality (length and cleanliness).  The gel photo below shows the results - clean preps of DNA fragments longer than 30 kb (the top size standard is the 27.5 kb HindIII fragment of Lambda DNA).
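    The NanoDrop concentration check rests on the standard dsDNA conversion (an A260 of 1.0 corresponds to roughly 50 ng/µL) and on the 260/280 ratio as a purity benchmark; a small sketch, with the example readings being hypothetical:

```python
def dsdna_conc_ng_per_ul(a260, dilution=1.0):
    """Standard dsDNA conversion: A260 of 1.0 corresponds to ~50 ng/µL."""
    return a260 * 50.0 * dilution

def looks_clean(a260, a280):
    """A 260/280 ratio near 1.8 is the usual benchmark for protein-free DNA."""
    return 1.7 <= a260 / a280 <= 2.0

# Hypothetical readings, not from the post:
a260, a280 = 0.80, 0.43
print(dsdna_conc_ng_per_ul(a260))   # ≈ 40 ng/µL
print(looks_clean(a260, a280))      # ratio ≈ 1.86, within the clean range
```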

    I have almost a mg of the DNA in the rightmost lane.  This was a separate prep - I hadn't yet discarded the high-phosphate plus arsenate cultures I'd done several days before (see Growth!), so I just pooled them all, collected the cells, and did a parallel DNA prep.

    One problem with the cultures was that the phosphate-limited cells without arsenate looked very sickly when I harvested them, with orangish clumps of debris in the medium after two days' growth and many misshapen cells seen under the microscope.  And I only got 7.6 µg of DNA from this culture.  So I inoculated a new culture, this time using 1000 ml divided between three bottles.  Again the culture looked bad, but I was more careful in harvesting the cells, and wound up with about 132 µg of DNA.  So on Monday I'll send 50 µg of each DNA to my collaborators.

    The critical test will be assaying for arsenic in the DNA from cells grown with limiting phosphate and 40 mM arsenate, since this is the condition that was claimed to cause arsenic to be incorporated into DNA.  I'm not sure how important the other culture conditions will turn out to be - if we detect no arsenic at all in this DNA, will we really need the other conditions to make our case?  But if we do detect arsenic in this prep, these will serve as controls for background arsenic levels.

    The other odd thing about my cultures was that the cells with 40 mM arsenate grew better than the cells without arsenate.  This could just be an effect of ionic strength, since I put an equivalent volume of water in the no-arsenate cultures, so I'm going to do a careful dose-response curve with a wide range of arsenate concentrations, using NaCl to keep the ionic strength constant.
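    The NaCl matching can be made concrete with the standard ionic-strength formula $I = \frac{1}{2}\sum c_i z_i^2$ — assuming (my assumption; the post doesn't specify the salt) that the arsenate is supplied as the dibasic salt Na2HAsO4:

```python
def ionic_strength(species):
    """I = 1/2 * sum(c_i * z_i^2), with concentrations c_i in mol/L."""
    return 0.5 * sum(c * z**2 for c, z in species)

# 40 mM Na2HAsO4 dissociates into 80 mM Na+ and 40 mM HAsO4^2-
arsenate = [(0.080, +1), (0.040, -2)]
I_ars = ionic_strength(arsenate)    # ≈ 0.12 M

# NaCl contributes I = c (both ions are singly charged), so matching
# the arsenate's ionic strength would take roughly:
nacl_mM = I_ars * 1000              # ≈ 120 mM NaCl
print(I_ars, nacl_mM)
```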


    What if all those vaccines - the ones that work really well - all stopped working? Imagine if the viruses and bacteria against which they are meant to protect you evolved and adapted to life in a largely immune population. The robust antibody and T cell responses generated within a person following vaccination supply the perfect breeding ground for the selection of resistant mutants, where antibodies can no longer recognise and neutralize their targets and where T cells fail to eliminate infected cells. So, is it possible, and is it happening?

    Well, we already know this kind of phenomenon from influenza, right? Every year we have to change the strains that are put into your flu jab to match those viruses predicted to be circulating come winter. This is based on generating an antigenic match of vaccine to wild virus; specifically, their surface HA proteins must look the same. This is why there has been such a push to develop universal influenza vaccines capable of immunizing people against all flu strains. For viruses like measles and mumps however, we have our universal vaccine, or at least we thought we did.

    Influenza may change through antigenic drift and shift forcing us to develop new vaccines each year, but do other viruses evolve through antigenic drift and force us to generate improved vaccines for them?

    One of the fears behind the recent mumps outbreaks in populations across the world - even those with very high levels of vaccination - is that such populations may provide the very breeding ground for vaccine escape mutants. We have even been noticing this with other vaccines: pneumococci, hepatitis B, and hepatitis A.  It prompts us to ask: are current mumps viruses able to get around vaccine-induced immunity, and if so, do we need to develop new vaccines?

    I've talked about the hypothetical reasons that may explain the recent mumps outbreaks, and one of these was that current mumps viruses are adapting to life in a human population that is immune to their infection, i.e. they are evolving to escape our vaccine-induced immunity. This is an extremely important question because, if true, it jeopardizes half a century's efforts in mumps elimination and in itself poses a significant public health problem. And as demonstrated by this recent paper, countries are already searching for new and improved mumps vaccines to get around this issue - rapidly taking a leaf out of influenza's book. Yet perhaps it is too early to do this.
    Secondary vaccine escape occurs when, despite the vaccine generating sufficient immunity to a pathogen, the virus or bacterium fails to be neutralized during an infection, either due to loss or waning of immunity or through changes in the antigenic proteins.  However, this has been debated for viruses like mumps, which was always thought to be monotypic, that is, immunity to one mumps virus type will inevitably protect you against every other mumps strain out there; basically, to your immune system, all mumps viruses look identical. This is why our universal mumps vaccine works so well.

    This thinking runs contrary to what's known about the great genetic diversity present in mumps - and really any RNA virus out there. When we explore the genetic sequences of all mumps viruses discovered, we can organize them into a number of groups known as genotypes. These genotypes form clusters of closely related virus sequences, of which there are around 12 or 13. Yet it has always been thought that, immunologically speaking, this diversity didn't matter. This is especially relevant when you realize that our mumps virus groupings are based on an immunologically irrelevant viral gene. But consider that the most used vaccine strain, Jeryl Lynn, comes from genotype A: what are the odds that a virus from another genotype would look different enough to get by the Jeryl Lynn-induced immunity? This gets more worrying when you note that no such outbreaks have been caused by viruses of the same genotype as the vaccine strain.

    In order to get a better understanding of this, researchers from the Food & Drug Administration and Queen's University, Belfast (Disclaimer: this is the group I am a part of) decided to look into this in more detail. In a relatively simple study (published here, in the Journal of Virology), they took antibody serum from children recently vaccinated against mumps and used it to try to inhibit infection by a panel of viruses representing known mumps virus antigenic diversity.

    Doing this, they would be able to tease out whether genotype A-induced immunity was able to protect against the virus in vitro. In doing so, they also determined the major antibody targets of mumps vaccination using a range of recombinant viruses. What they found was that vaccine-induced immunity was able to effectively neutralize all groups of mumps viruses despite slight antigenic differences, i.e. even though the viruses looked a bit different, they all looked sufficiently similar to the vaccine virus to be inhibited. So can we still say that secondary vaccine escape is occurring?

    There are a few things to be aware of in this study: first, they focused on recently vaccinated children, yet the outbreaks occurred in university-age adults, who have significantly lower antibody responses due to waning immunity. Remember that secondary vaccine escape involves both loss of immunity and changes in viral proteins, so how would the above results differ if lower antibody responses were considered? We'll have to wait for that, especially as waning immunity has already been suggested - along with antigenic differences - to play a major role in these outbreaks. The final point is that they only studied B cell responses and not T cell ones, although it is thought that T cells play little role in protecting against mumps.

    But what this work does add to is the already growing precedent that current mumps vaccine regimes - while generally having worked very well in the past - may not be sufficient to protect against and eradicate all strains of mumps in the future once waning immunity is taken into consideration. Luckily, we can get around drops in antibody levels through catch-up booster vaccines, so should we be looking out for MMR catch-ups in the coming years?

    Rubin SA, Link MA, Sauder CJ, Zhang C, Ngo L, Rima BK, & Duprex WP (2011). Recent mumps outbreaks in vaccinated populations: no evidence of immune escape. Journal of Virology. PMID: 22072778

  • 11/18/11--15:54: In search of violations
  • This week, high-energy physics delivered some interesting results for our fundamental knowledge of the universe. On 14 November, LHCb presented a preliminary analysis of a possible CP violation in charm decays (see also the CERN Bulletin and Mat Charles' presentation).
    Research on CP violation is very important because it probes the differences between matter and antimatter. If a physical law is CP-invariant, the behavior of matter and antimatter must be the same. But our universe is made of matter, and we don't know why that is so. The answer could lie in CP-violation studies, like the preliminary data analyzed by the LHCb team. The main goal of the experiment is to study the properties of the b quark, but it can also measure the properties of the c quark. Studying the preliminary data on c decays, the team found a hint of CP violation in an unexpected channel. Following Tommaso Dorigo and Marco Delmastro (English translation by Google), if the result is confirmed by further analysis, this could be the first sign of physics beyond the Standard Model.

    The other possible violation concerns the wall of the speed of light: indeed, the OPERA experiment has confirmed its previous data. Yesterday, in the updated version of their famous preprint, OPERA's researchers described a new series of measurements performed with CNGS using a short-bunch, wide-spacing beam.
    The new measurements confirm the previous observations: some superluminal neutrinos arrive in OPERA's detector. Their advance is $62.1 \pm 3.7$ ns for the bunched-beam test and $57.8 \pm 7.8$ ns for the main analysis (the preprint shows the distribution of $\delta t$ for the second analysis).
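    To put the ~60 ns advance in perspective, we can compute the implied fractional speed excess over the CERN-Gran Sasso baseline (the ~730 km distance is a well-known figure, not stated in this post); a minimal sketch:

```python
C = 299_792_458.0        # speed of light in vacuum, m/s
baseline_m = 730e3       # approximate CERN–Gran Sasso distance, m
advance_s = 62.1e-9      # reported early arrival, bunched-beam test, s

tof_light = baseline_m / C          # light's time of flight ≈ 2.44 ms
excess = advance_s / tof_light      # fractional speed excess (v - c)/c

print(f"light ToF ≈ {tof_light * 1e3:.2f} ms")
print(f"(v - c)/c ≈ {excess:.2e}")  # a few parts in 10^5
```

    which reproduces the order of magnitude of OPERA's quoted $(v-c)/c \approx 2.5 \times 10^{-5}$.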
    About this result, Giovanni Fiorentini of Ferrara's INFN said to Le Scienze, the Italian counterpart of Scientific American:
    But the glass is half empty because the neutrino bunches are very short when they leave, and they should be short also when they arrive: on the contrary, it seems that some of them fly a little longer and others a little less, as if there were a scattering; this probably reflects the fact that the temporal resolution of OPERA's detector isn't at the nanosecond level but a bit worse than expected, although not so much as to affect the result: we can say that the test has gone well at 70 percent but not at 100 percent
    But following Philip Plait:
    However, they used the same timing apparatus, and a lot of people - me included - think this is where the problem lies. They need to figure out a way of making that more transparent and perhaps using a different timing method.
    So we must wait for the MINOS team: they are preparing an experiment to perform a new measurement of the neutrinos' time of flight.
    Other neutrino links:
    Nature's blog
    Tommaso Dorigo
    Sascha Vongehr
    The neutrino's saga on Doc Madhattan:
    News from the OPERA
    Probably not
    Waiting superluminal neutrinos: from Maxwell to Einstein
    Waiting the superluminal neutrinos (if they exist!)

  • 11/19/11--01:52: Recycle plastic pots
  • A local green action organization is having a "recycle everything you didn't think could be recycled" day, and one of the things that can be recycled is plastic pots. The Phactor knew that if he saved them up long enough, in big stacked sets, they'd be good for something, or someone would get around to recycling them. Apparently plastic pots can be recycled into plastic landscape "timber," among other items. How appropriate. And that's good, because you would like to think that gardening is a pretty green industry, and yet there are all those plastic pots. A few plant providers have switched to pots composed of organic materials that just decompose, although some seem to take too long. More alternatives are needed. Now to load up the stacks and destroy the evidence of my plant-buying problem. It's sort of like hiding those candy bar wrappers in the garbage. Actually it can be slightly embarrassing when recycling all the wine bottles too. It's not that so much wine is consumed, just that our recycling is done so infrequently.


    Once upon a time, there was a wicked and vain queen with a strong desire to murder her stepdaughter. However, through a convoluted series of events involving a huntsman, seven dwarves and a handsome prince, the stepdaughter survived. In fact she ended up engaged to the handsome prince. As a wedding present to his new bride, the prince invited the queen to the wedding. A show of forgiveness? Not exactly.
    As part of the night's entertainment, the queen was forced to wear a set of red-hot iron shoes and dance for the amusement of all the guests. Until she dropped dead. The moral of this story?

     Don't f*ck with Snow White.

    Whilst this may seem like a cruel and unusual punishment for us, dancing yourself to death was a genuine fear for people when the Snow White legend came into being.
    A medieval medic, known as Felix Platerus, recorded a curious case involving a young lady. She appeared to have a supernatural compulsion that forced her to dance. Guards had to attend to her for a whole month to prevent her hurting herself, or others. She danced until her feet were rubbed raw and she could no longer stand. And then, when she could stand, she started dancing again. A few of the writers of the time make reference to Arabic descriptions of "jumping limbs" and palsy, and a number of writers noticed the convulsiveness that characterised these dances.

    In northern Europe, it was believed to be caused by some form of demonic possession, and sufferers had to pray to St Vitus (the patron saint of dancers) to be relieved of this disease.  In Italy and Greece, it was believed that the cause of the disease was the bite of a wolf spider. The remedy for this disorder was a frenzied dance which eventually became known as a tarantella.

    It was not only these nations that took up dancing as the cure for this disease. In parts of Germany, magistrates employed people to play music for these people, and for others to act as dancing partners. This was to allow them to "dance out" their disease.
    Choruses of sufferers roamed the streets. Musicians followed them, attempting to soothe their disease by providing rhythm. The greatest challenge in managing this chorus was ensuring the safety of the people who ended up collapsing from exhaustion.

    In Strasbourg in 1518, an outbreak of this fever prompted the council to set up a guildhall especially for the dancers, paying some people to dance with them and others to play music for them. Yet still, many danced themselves to exhaustion, and death. Eventually, the Strasbourg council realised this treatment was not working. Still unaware of the cause of the disease, and wanting to prevent contagion, the city banned all dancing. The sufferers who still danced in spite of the ban were sent on a pilgrimage to the shrine of St Vitus. Once they arrived they received holy oil, and red shoes.
    These choruses of dancers led to a particular name being ascribed to the disorder: a chorea. The literature documenting this disease is so littered with allusion and innuendo that it is difficult to get to the truth of the source of this illness.

    And into this quagmire strolls the hero of this particular story. His name was Thomas Sydenham. He was a physician of the practical sort, whose personality and career was wrought in the events of the English civil war.

    The English Civil War can be said to have begun with one man.  When King Charles came to the throne, he had high ambitions. He wanted to finish his father's work of bringing unity between Scotland, England and Ireland. Furthermore, he wanted to enter the brutal Thirty Years' War that had been raging on the continent.
    But what was more controversial was how he dealt with parliament. At the time, parliament's power was far less than it is today. It collected taxes, and controlled how they were spent. If the king wanted to use any of that money, he needed to consult with parliament. Charles, however, believed in the king's "Divine Right" to rule over his subjects, and that he should answer to no-one but God himself. Needless to say, there was some friction between the King and his parliament.

    One example is when Charles demanded funding for an expensive military campaign on the continent. Parliament, initially supportive of this course of action, questioned him over putting one of his friends in command, instead of more competent individuals. Their fears were confirmed when the war ended in an expensive defeat. When parliament attempted to bring charges against the man responsible, Charles dissolved parliament.
    Such exchanges began to characterise the King's rocky relationship with his kingdom. Charles needed parliament to fund him, but in return they would want a say in how the money was spent.  So the king stopped convening parliament, and attempted to rule the kingdom as an autocracy. He levied new (and illegal) taxes upon the populace as a replacement for the parliamentary purse. He instilled a new religious doctrine, and began to persecute those who did not adhere to it. For those in religious minorities, this was a worrying chain of events. Puritans, such as Thomas Hooker, sick of the religious persecution, emigrated to America.
    The Scots also did not take these changes lightly. Soon, they were in open rebellion, and the King didn't have the money to fight them.  So after eleven years of "personal rule", Charles was forced to convene parliament again, to secure more funds.
     Various members of parliament saw this as an opportunity to bring their king under control, and introduce more reforms. It did not turn out that way. Predictably, this resulted in another quarrel, and in less than a month, parliament was once again dissolved.
    The king went to war anyway with what little funds he could muster. The ensuing fiasco ended with Scotland in control of the North of England. The Scots, magnanimous in victory, promised they wouldn't burn all of the cities in their possession so long as Charles paid them a ransom.
     The King couldn't just let the north burn. He needed to reconvene parliament . In return for the money, reforms were enacted that aimed to end the king's autocratic habit. But Charles would not crack so easily.
    By the time Thomas Sydenham began his studies in Oxford, events had come to a head. In January 1642, King Charles had marched with 400 men to parliament with the intention of arresting 5 of its members, but without success.
    Across the country, towns and constituencies began declaring fealty to either king or parliament. War was brewing.  Parliament began recruiting an army to do the unthinkable. To fight against their own king.
    In the early stages of the war, Sydenham's mother was killed when Royalists raided their home. This act drew the whole family deeper into the conflict.
    Sydenham left Oxford and joined his family in declaring loyalty to parliament.  Their exploits during the war led them to be known as "The Fighting Sydenhams".

    Thomas's older brother, Francis Sydenham, became a legend in the Parliamentarian army. When attacking the seemingly impregnable town of Corfe, it was Francis who oversaw the construction of a tank, which was used to great effect to assail the town.
     At one point, Francis pretended to defect to the Royalist side against his own family. In doing so, he tricked Royalist forces into entering a devastating ambush in Poole. And during that battle, by chance he encountered the man responsible for the death of his mother, and killed him on the spot*.

    Thomas's role in the war is largely overshadowed by those of his brothers and his father. It is known that he was wounded in a cavalry charge at the battle of Weymouth as the war was drawing to a close.  By the time the royalists had surrendered and the king had been executed, two of his brothers, including Francis, were dead. It was a new England, with Oliver Cromwell and parliament in charge.
    Thomas Sydenham returned to Oxford to take up his studies once more, and found it a much different place. It had been one of the last royalist strongholds to fall, and Parliamentary forces aimed to keep it under strict control.   Parliament had put "visitors" in place, who enforced parliament's political will on campus. They expelled anyone who spoke out against the new government.

    Thomas, being a leading member of the revolution, thrived in this environment. Parliament was eager to bestow some commendation upon him for his family's role in the war. As a result, he ended up with a doctorate within his first year of study. In addition, he may also have played a role in helping Cromwell's "visitors" expel professors on the basis of their royalist leanings. This removal of old academics aided the careers of younger academics such as John Locke and Robert Boyle, both of whom were friends of Sydenham.

    Nevertheless, the army was not yet done with Sydenham. Whilst Cromwell had successfully won the civil war, winning the peace was a different matter. After returning from a brutal and bloody campaign in Northern Ireland, Cromwell turned his eyes north. The Scottish parliament were jockeying for more power in the new order, and recruited the heir to the throne, Prince Charles II to force the issue. In 1651, Cromwell answered with an army of seasoned veterans. Once again, Thomas Sydenham found himself at war. He was put in charge of his own battalion, where he used his medical knowledge to fortify his troops against the northern frosts. But this didn't stop him from being left for dead on a battlefield and nearly shot at point blank range by a drunken trooper from his own side. When Scotland was defeated, Thomas left the army, never to return.

    He devoted the rest of his life to the study and the application of medicine, and managed to revolutionise the way it was practised in England. At the time, there were many explanations of diseases, and often doctors would prescribe treatments accordingly.  The vast increase in anatomical knowledge led physicians to prescribe bleeding to balance the "humours" that allegedly controlled the body. Various beliefs about matter, and findings in chemistry, led to mercury being used as a purgative. The best "treatment" for a convulsive jerking disorder at the time was to "dance" the demons out.

    When the founder of the British Museum, Hans Sloane, applied to be a student of Sydenham's, he presented the physician with a letter of introduction. Sydenham glanced at the letter. It indicated that Sloane was "a ripe scholar, a good botanist, a skillful anatomist".
    "This is all very fine," Sydenham cried "but it won't do! Anatomy ? botany! Nonsense! Sir, I know an old woman in Covent Garden who understands botany better, and as for anatomy, my butcher can dissect a joint fully as well. No, young man, all this is stuff; you must go to the bedside; it is there alone you can learn disease."

    He recognised that many of the findings of anatomy and science were very much in their infancy. The application of their principles when they were so poorly understood was foolhardy. He staunchly believed that evidence was the guiding factor at the bedside.

    "It is my nature, to think where others read; to ask less whether the world agrees with me than whether I agree with the truth" he says in one of his works.  He flatly states that to speculate on the causes of diseases was a "difficult and perhaps inexplicable affair; and I choose to keep my hands clear of it".

    He was one of the few men of the day who recognised that they simply didn't have the tools to speculate on the causes of disease, and that the bedside was a far better teacher than the library.
    What Sydenham frequently found was that the application of unproven theories to the treatment of patients did more harm than good, and that often the best thing to do was to let nature take its course.

    Sydenham had very little patience for nonsense, so one wonders what he thought when he first heard of the dancing fever. When he first encountered a patient, he did not speculate on whether it was caused by demonic possession or by a spider bite.  He focused on collecting as much information as possible about the disease, and then let the evidence speak for itself.  He described the patients as exhibiting convulsive movements out of their control.**  Instead of prescribing dancing, or prayer, he simply let the disease run its course, and found that patients would eventually get better without interference. The deaths attributed to this disease were probably caused by forcing sufferers to "dance out" their disease. Whilst he did not know it at the time, this dancing disease was related to another disease he studied: the disease that is the subject of this series.

    It was Thomas Sydenham who gave scarlet fever its name***. His initial studies on epidemiology in 1666 are where he first describes the disease, although the actual name "Scarlet Fever" only appears in the 1683 edition. He noted the seasonality of this disease, made key observations on its course, and found that the treatments of the time were at best ineffective.
    Unfortunately, his description of the fever leaves out many of the essential details that were catalogued by Sennert. Sydenham's failure to recognise the more severe forms of this disease (such as the dropsy and arthritis) meant that subsequent physicians often confused it with measles. Such was Sydenham's repute that his word was deemed to be the last one on this issue for over a century.

    Nevertheless, his work was revolutionary. The dancing fever would eventually be named after him, becoming known as Sydenham's chorea, and the wide regard for his work raised awareness of scarlet fever.
    Unbeknownst to him, the two diseases are linked. Previously, I talked about how Sennert was amongst the first to recognize that scarlet fever can trigger other diseases, such as "dropsy" and rheumatism.
    Today we know that these occur because the bacterium that causes scarlet fever, Streptococcus pyogenes, often uses a variety of techniques to hoodwink the immune system. In some cases it can turn the immune system against the rest of the body. These immune reactions are what cause these "disease sequels".

    In the case of Sydenham's chorea, the immune system has a bad reaction with the nervous system, causing damage. This manifests itself as the juddering dancing motions characteristic of the disease. It usually resolves of its own accord after the underlying bacterial disease is cleared. So Sydenham's basic treatment of "wait and see", or even the religious method of "wait and pray", would have worked far better than the other treatments available at the time.

    But what Sydenham did for medicine was not limited to his descriptions of diseases. Indeed, in many of his writings he often proclaims his ignorance on many subjects. It was the way he practiced medicine that secured his place in history. His re-introduction of Hippocratic methods of evidence gathering was the key to his success.  After his death, the following was written of the great practitioner:

    "The great merit of Sydenham, was to proclaim the great truth that science was, and always must be incomplete; and that danger lurks in the natural tendency to act upon it as if it were complete. A practical man has to be guided not only by positive knowledge, but by that which is imperfectly known. He must listen to the hints of nature, as well as to her clear utterances. to combine them may be difficult; but the difficulty is solved in minor matters by the faculty called common sense; in greater affairs by the synthetic power of Genius " ***

    * In one account, it was said to be Thomas himself who avenged his mother, but sources that are more reliable indicate that it was actually Francis.
    ** There seems to be a bit of controversy over the actual causes of the medieval dancing disease. Whilst the diagnosis of Sydenham's Chorea may make sense in the context of the devastating streptococcal outbreaks pushing their way through Germany at the time (as documented by Weyer and Sennert), it does not explain everything.
    The medieval descriptions of the sufferers of Chorea differ much from Sydenham's own account. They talk of sufferers leaping and so forth. It is the imprecision of language that makes it hard to definitively describe the disease. It appears that only Arabic accounts agree with Sydenham's description, but I have not been able to source the relevant documents to back this up, and even if I did, I can't read Arabic.
    It is also likely that hysteria may have played a role, especially where people were employed to dance with these victims. In fact, the dancing bans in Strasbourg likely helped to quell the spread of this hysteria.
    There are also theories that a rye fungus outbreak spread hallucinogenic substances on foodstuffs, although nothing in the accounts I read indicates that these dancers were actually hallucinating.
    The truth is that there is no truly satisfactory explanation for this disease. Sydenham's explanation was good enough for those who lived contemporaneously with the disease, so I decided that it was good enough for me.

    ***While Samuel Pepys is probably the first person recorded to use the term "scarlet fever" to describe a disease, it is unlikely to refer to the disease we know today. It may have been used interchangeably with measles and other diseases that produced a red rash. It should also be noted that colour-based adjectives were often imbued with some form of emotion, with "scarlet" and "black" being not only descriptive but emotional terms. When researching these articles, I was struck by how almost every disease was described as a "black plague", including those which exhibited streptococcal symptoms. But in some cases, the writers may well have said "terrible plague" or "scary plague". This use of language makes investigating diseases in the past a very confusing affair for the novice.

    *** I have scoured the internet for the source of this quote, and came up with zip. Any help with this would be appreciated.


    Google Books

    "A History of Madness in Sixteenth-Century Germany" By H. C. Erik Midelfort

    "Anatomy of Melancholy," by Robert Burton

    "The continuum companion to Locke", by Sami-Juhani Savonius-Wroth, Paul Schuurman, Jonathan Walmsley

    Dr Thomas Sydenham- by Unknown.


    History of Scarlet Fever, by J.D. Rolleston in BMJ

    "Thomas Sydenham: The English Hippocrates" by William J Fisher in the Canadian Medical Association Journal 

  • 11/20/11--02:23: Paramecium, nice to meet you
  • How many of you recall one of the first cool science-related things you experienced? I bet if you think about it, even if you no longer give a rat's ass about science, you can come up with something from childhood. Maybe seeing puppies or kittens being born, watching a frog or butterfly develop from a tadpole or caterpillar respectively, or seeing light split into diverse colors through a prism.

    Malamute puppies


    I can think of two things that got me hooked on the wonder and awesomeness of biology. One was 'discovering' my brother's microscope. It's a single eyepiece light microscope. I still have the beast and it still works, although it needs a new bulb. Once this device was discovered, it opened a whole new world to me: pondscum. That was when I was first introduced to a beautiful little beast. I didn't know its name or even what the hell I was looking at. What I did know is that it was love at first sight. Now I admit our relationship faltered when I met ColecoVision and was ruined when I realized the opposite sex was more than just a cootie factory. However, it ignited a longing that burned deep within me, forever influencing my...well, let's not get overly dramatic.

    The sheer awesomeness that comes from seeing these little beasts swimming around in the pond behind your house with your own eyes (well, eye, since it was a monocular scope) is inspiring. Here's a video using a much better scope than I had, which I hope can give you an inkling into that sense of wonder that arose in a child.

    There are many aspects of Paramecium biology worthy of discussion: separation of 'somatic' and 'germ line' nuclei, the trichocyst, digestive progression, whole genome duplications, macronuclear development, RNA editing, endosymbiosis, etc. We will touch on a few of these in the next few weeks.


    The petunias in the window boxes outside our front bedroom windows, after producing a pink cascade for at least 5 months, died two nights ago. This is not so surprising actually, since petunias are not cold hardy at all, so the first good frost or freeze of the fall always does them in. But what was surprising is the date, November 18th, a very late date for the first killing freeze (about 25 F). The weather had flirted with frost a few times, but here in our urban heat island it was never frosty enough. Most gardeners recognized how late the season was in a variety of ways, and then they nod and say "global warming". While the Phactor is positive global warming is real, such departures from normal averages are just weather. Over the long haul more frequent deviations produce a climate trend. Models of global warming predict more extremes, hotter hots, colder colds, wetter wets, drier dries, early snow storms, late freezes, and so on, but greater amplitude in weather may not cause a change in the means as the extremes tend to average out. 2011 had a late spring, cold and wet, and it pushed flowering back (in comparison to 2010), but then a very late mild fall that was very good for the baby bok choi.


    Interesting little commentary on how (not) to do a résumé: Final Cut: Words to Strike from Your Resume.

    I don't have a résumé, 'cause in science that isn't expected. I have a CV, and the difference is that in a CV we just list pretty much everything we've ever done, as opposed to writing about how great we are. But I used to have one when I was working as a programmer, and I made just the mistakes that Elizabeth Lowman cautions against:

    • Saying what you hope to do in your next job (you should list your top accomplishments)
    • Saying you're experienced (you should give details of that experience)
    • Saying you're a team-player (you should give examples of how you have been a successful team-player)
    • Saying you're dynamic and energetic (you should accurately describe your skills instead)
    • Saying references are available upon request (you should assume that the prospective employer knows this)
    These all sound like great points to me, and I hereby pass them on. However, my point with this post is to ask what use it really is to do so. Why should anyone give away this advice at all?

    The reason I ask is that the job market surely is a zero-sum game. There are at any one time only a finite number of jobs, and presumably those jobs will be filled. At least, those that go unfilled probably don't stay that way because applicants lacked the world's best résumés. Of course, passing this advice on to your friends has a direct benefit to you - that is, if you care more about your friends than about everyone else. But giving away this advice on the internet? Is that because one cares more about job applicants who read stuff on the internet?

    It seems a little like commercials and advertisements: Who do they benefit? Ignoring the fact that commercials are very often misleading and deceitful (and if you don't know which ones tell you lies, then how can you trust any of it?), nearly the best thing one can hope for is that they shift market shares. A new detergent - no different from the nine brands already on the shelves - can conquer a large percentage (though not more than 100) of the market with an enticing ad campaign. But it doesn't make people do more laundry, does it? (Not that that would be a good thing, either.) Rather, it takes away market shares from other companies. And it's not that I care which company survives, or even that we have 9 rather than 10 that do (though I guess I don't prefer monopolies), but I care that so many resources are spent making and watching commercials. It is, like coffee, wasted resources on a planet that is already not able to feed everyone the way its resources are currently managed. If the people who make commercials were made to do something useful (and lands for coffee beans were instead used to grow food), perhaps we could make this place a little bit better.

    So why? Who benefits from giving away this information? The answer is, of course, that Forbes and Elizabeth Lowman benefit. It's a known fact in evolutionary theory that what benefits the individual often doesn't benefit the whole population. What is good for me is not always good for society. It needn't be bad for society, as in the case of this advice being given away in Forbes, but it definitely does look nice in Elizabeth Lowman's résumé. And I'll admit that it is good to have an educated and competent citizenry, and it certainly doesn't hurt that people can sell themselves well in their job applications - as long as it doesn't skew the hiring results in a way that makes it less likely that companies will hire the best person for the job.

    Nevertheless, the résumé building advice is hereby passed on, and I think it's pretty good advice, too. And you may very well then ask me why I blog about this in the first place. What's in it for me? And the answer is that I don't know. I just felt like it, I guess. There are definitely people who I'd like to do well, such as friends currently looking for jobs. I could of course just have sent you an email with a link to the article, but I know you read this...

    Good luck hunting for jobs!


    While many people act, and drive, as if their automotive vehicle (aka car) is an extension of themselves, a statement about their personality and immaturity, cars hold no particular fascination for the Phactor. Fortunately my use of a car is far, far below average, although owning one remains a necessary convenience, but basically cars are in the same category as toasters, just a lot more expensive. However, if you cannot park one yourself, you shouldn't be driving one. You want a toaster that daily delivers a nice even golden brown on both sides and accommodates bread, bagels, and baguettes with equal ease. Decent toast is a life requirement because it is how you start your day. The best toast in my memory was a perfect baguette served with a platter of tropical fruit and a very, very good cafe con leche on a patio overlooking the Pacific coast of Costa Rica (think rainforest meets Big Sur). OK, the location probably had a great deal to do with the memory, but it was good toast. Having been blessed by location, shrewdly chosen, the Phactor does not drive on a day to day basis, having lived within walking distance of his work place for 40 of the past 42 years. As a result, over the past quarter of a century the Phactor has owned only two vehicles, both quite reliable and serving my purposes quite well, but as my blog is not monetized, no endorsements are forthcoming. But today sitting in a waiting room of an automotive service establishment reminds one how annoying vehicles can be if not reliable. In this particular case after nearly 9 years and 55,000 miles (only), and with winter approaching, the car needs new tires. Even so, this place is bleak beyond belief, worse than an airport and without the people watching. If only the toaster could be balanced for browning as easily as these tires. Does a good toaster actually exist? How can someone have such good fortune with vehicles and yet be continually displeased with toasters?
The current machine toasts with total indifference, a perfect lemon, whose most redeeming quality seems to be its color. So if anyone can recommend a make and model both competent and reliable it would be appreciated.


    A brief update today: I've written twice before about the mistaken hypothesis that chronic fatigue syndrome (CFS) is caused by a virus known as XMRV. After many followup studies failed to replicate the original findings, other scientists finally determined conclusively that XMRV was a contaminant in the original cells used in the experiments. Lead researcher Judy Mikovits continued to claim she was right and that everyone else was wrong, despite the evidence, but in a surprising move less than two months ago, all the authors (including Mikovits) retracted the paper. (Actually it was a "partial retraction", but they did admit that XMRV was a contaminant which pretty much blows up the whole claim.) Science is now investigating whether some of the data in the paper was falsified, as Trine Tsouderos reported in the Chicago Tribune last month.

    In a bizarre twist in this saga, Mikovits was arrested and thrown in jail on Friday in California. Science magazine's Jon Cohen reported that her former employers, the Whittemore-Peterson Institute, which fired Mikovits on September 29, filed felony charges against her in Nevada for stealing their laboratory data. It appears that WPI claims Mikovits kept data about her experiments on her personal computer and has refused to give it back to WPI. Mikovits' lawyer denied the charges.

    I suspect this isn't the last we'll hear of this story. But the science is done: XMRV isn't the cause of CFS, and the search for a cause continues.


    So here's the situation: the hydrogenated Glycine max oil had been creamed together with crystallized, purified vacuolar sap from Saccharum, two sterile ovules of the domesticated Asian jungle fowl, and an ethanolic extract of fermented fruits of the Vanilla orchid, so it was time to stir in the shredded endosperm of Cocos nucifera, the chopped embryo of Juglans regia, and the candied, ground seeds of Theobroma cacao, only to find that the most critical ingredient, finely powdered endosperm of Triticum spp., was insufficient to the task at hand. Now, some 5 hours earlier the Phactor's vehicle had been checked into an out-patient auto center to get new tires; fortunately the people's grocery is only about a 20 min walk away, but walking home with 11 lbs of ingredients indicated the utility of having at least a cart. The preparation time certainly did not take into account not having all the ingredients on hand. Still, Ms. Phactor and F1 seemed to approve.


    Like victims of catastrophic head injuries, people with synesthesia often appear in neuroscience papers identified only by their initials to illustrate the mysteries of the brain. But synesthesia's not a freak occurrence. It's estimated that 2-4% of people have abnormal connections between their senses. The condition may not be an accident at all, but a trait that evolution has retained for a reason.

    The authors of a new review paper, David Brang and V. S. Ramachandran, ask why synesthesia has survived. Since it runs in families, synesthesia seems to be partially genetic. But it appears in many different forms--more than 60 have been documented. Garden-variety synesthetes see numbers, letters, or sounds in specific colors. Less commonly, synesthetes may experience each day of the week as a certain point in space, or feel touches on their own bodies when seeing another person touched. Many genes may be involved, and the interaction between synesthesia genes and a person's environment might lead to all kinds of outcomes.

    It's possible that the genes promoting synesthesia have been kept around by evolution because they have a "hidden agenda." Another such trait, Brang and Ramachandran say, is sickle-cell anemia, which in addition to its unhelpful medical effects grants protection against malaria. Aside from the obvious sensory quirks, do synesthetes have a sneaky superpower?

    Ramachandran, incidentally, is the person who broke the news to me about my own synesthetic tendencies. During my freshman year in college, my friends and I went to a talk he was giving on campus. We settled into folding seats, ready to hear about an exotic cognitive phenomenon. "For someone with synesthesia," Ramachandran explained to the auditorium, "the number 3 might always appear as red."

    Lame. I leaned toward my dorm-mates. "But 3's are green," I whispered. They turned to stare at me. "Oh," I said.

    Ramachandran then projected a screen full of 5's and 2's, printed as if on a digital clock, square-edged reflections of each other. I was distracted by a weird illusion: Although the numbers were all in black, there were flickers of maroon and navy wherever I wasn't looking, like the gray blobs that appear in your periphery when you look at a grid of black squares. I wondered if I was seeing a trick of light from the projector. Then I heard Ramachandran explaining, as he moved to the next slide, that this was a test for synesthetes, who could discern a hidden pattern among the 5's and 2's more easily because of their associated colors. Ohh, I thought.

    Brain research has only begun to figure out what's happening inside a synesthetic brain. For grapheme-color synesthetes (people who associate numbers or letters with colors), seeing those numbers or letters activates a color-perceiving brain region called V4. This shows us the connection is happening on a sensory level, and not in the realm of abstract ideas. A number doesn't just remind a synesthete of a color; it triggers a color-sensing area in the brain.

    A recent paper suggested that the visual centers of grapheme-color synesthetes are hyperexcitable, responding to only a fraction of the stimulation needed for non-synesthetes. Perhaps relatedly, some researchers think synesthesia comes from lazy pruning in the brain. During development, the brain trims out extra neural connections to keep everything running efficiently. But synesthetes may have given their brains' gardeners too many days off, and the resulting overgrowth may link brain centers that shouldn't be related.

    So what superpower could an overactive and underpruned brain have? Synesthesia is more common among artists, and synesthetes tend to be more creative than others. Maybe today's artists were a previous era's tool-builders, chipping stones into new shapes and getting a bigger share of mastodon meat in return. Or maybe evolution has never selected for artists, and creativity is just another side effect of the synesthesia genes.

    As a more convincing superpower, synesthetes might have enhanced sensory perception. Grapheme-color synesthetes, for example, are especially good at detecting colors. Synesthetes in general also have improved memories. This especially applies to something like a telephone number, which can be easier to remember because of its associated colors. But if synesthetes' better memories extend to other kinds of sequences or details, that trait could have given them an evolutionary boost in the past. Synesthesia genes might indirectly help people perceive and remember their environments--or the experience of synesthesia itself might be what helps them.

    Personally, I've never noticed any sort of benefits (unless you count my brief and uncool foray into pi memorization in eighth grade). But synesthesia is good for an embarrassing moment now and then, such as during the occasional poker games I've played with friends. The problem with poker is that I can hear a hundred times how a green chip is worth 25 (imaginary) dollars or a blue is worth 10, but I'm never going to believe it because those colors don't fit the numbers. Other people have to help me place bets because doing arithmetic with poker chips stumps me.

    I discovered a similar pitfall when using some DNA sequencing software for my college senior thesis. Our machine read the sequence of DNA bases and returned a series of A's, T's, G's and C's. But instead of just a string of letters, the data took the form of a series of colored peaks:

    Whenever the computer was uncertain about the sequence, I had to double-check the peaks and enter the corresponding letters myself. Now, I'm not the only person who thinks A's are red; it's a common association among synesthetes. But in this software A was green. The other three letters were also wrong--almost perversely so, it seemed to me. I had to check the key another time with every base I entered. My advisor probably thought I'd had a stroke when he saw how badly I was struggling to memorize a simple four-letter code. When I put my head in my hands and groaned that T isn't red, it's just as blue as the number 2 is, he abandoned me at the computer. "I don't know what you're talking about," he said over his shoulder.

    Finding out more about the mechanics of synesthesia would give researchers insight into the working of the brain in general. Besides the obvious open questions (like what causes synesthesia and how it works), the authors point out some other areas for research: Does synesthesia exist in animals? Does everyone in the population fall somewhere on a spectrum of synesthesia? Do the genes causing synesthesia independently boost memory or sensory abilities--or does synesthesia itself benefit mental abilities somehow? And although numbers and letters can evoke colors, why is it never the other way around?

    As much as I'd love to have a superpower, I'll settle for being mildly interesting to neuroscientists while having a brain that's in one piece. And avoiding poker games.

    Image of grapheme colors: Brang and Ramachandran, doi:10.1371/journal.pbio.1001205.g001. (I did not draw this picture.)

    Brang, D., & Ramachandran, V. (2011). Survival of the Synesthesia Gene: Why Do People Hear Colors and Taste Words? PLoS Biology, 9 (11) DOI: 10.1371/journal.pbio.1001205


    Most people - certainly most atheists - would say that one of the biggest problems with religion is the conflict you get when religion divides people who share a particular part of the world. Of course, there are plenty of examples of conflicts where religion plays a role. However, there is surprisingly little statistical evidence either way.

    Part of the problem is in trying to define religious diversity. The method most commonly used in sociological research was developed by Alberto Alesina, an economist at Harvard, and is called 'Fractionalisation'. This computes a number between 0 and 1: the probability that two people picked at random belong to different religions (or races, or whatever else you are looking into).

    The problem is that this kind of diversity may not be the diversity that's important here. If everyone had their own personal religion well then, society would indeed be diverse - but it probably wouldn't trigger mass conflict.

    An alternative measure is something called 'Polarisation'. The more evenly a country is divided into two major groups, the higher its polarisation will be.
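The two measures can be made concrete with a small sketch. This is my own illustration, not from the paper: fractionalisation here is the standard Alesina-style index F = 1 − Σp², and for polarisation I am assuming the commonly used Reynal-Querol index RQ = 4·Σp²(1−p), since the article does not give the formula.

```python
def fractionalisation(shares):
    """Alesina-style fractionalisation: the probability that two randomly
    drawn people belong to *different* groups (1 minus the Herfindahl
    index of the group shares)."""
    return 1.0 - sum(p * p for p in shares)

def polarisation(shares):
    """Reynal-Querol-style polarisation: maximised when the population
    is split evenly into two large groups."""
    return 4.0 * sum(p * p * (1.0 - p) for p in shares)

# An even two-way split: F = 0.5, RQ = 1.0 (maximum polarisation).
even_split = [0.5, 0.5]
# Ten equal small groups: F = 0.9 (very diverse), RQ = 0.36 (not polarised).
many_small = [0.1] * 10

print(fractionalisation(even_split), polarisation(even_split))
print(fractionalisation(many_small), polarisation(many_small))
```

Note how ten small groups score high on fractionalisation but low on polarisation; that is the point above, that a society where everyone had a personal religion would be very diverse without being polarised into opposing camps.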

    Adam Okulicz-Kozaryn, a sociologist at Harvard, has used both of these measures to see how they interact with life satisfaction (Okulicz-Kozaryn last featured back on this blog in 2009, Happy worshippers, unhappy believers).

    He found a small relationship between Fractionalisation and unhappiness, and a somewhat stronger relationship between Polarisation and unhappiness (it's this that is shown in the graphic).

    The effect got stronger when he took into account other factors that can affect unhappiness, such as age, marital status, and national wealth.

    In fact it got even stronger after accounting for other religious variables, such as whether people attend services (increases happiness), think that religion is important (also increases happiness) or believe in God (which decreases happiness).

    Once you account for the positive and negative direct effects of religion on personal happiness, then it becomes clear that religious diversity is linked to increased unhappiness. And that's true whether you measure religious diversity as Fractionalisation or as Polarisation.

    It's a big effect, too, as Okulicz-Kozaryn says:

    ...if a country’s fractionalisation index goes up by 0.25, say from the level of 0.57 for Japan to 0.82 for the United States, then life satisfaction for everybody in a country would drop by 0.25 on scale from 1 to 10. This is a big effect – it is similar to shifting 5% of a country’s population from the mildly satisfied category (6) to most dissatisfied category (1). In case of Japan it would be six million people
    He's careful to point out that this does not mean that religious diversity is a bad thing. For example, other factors that encourage diversity (openness to other cultures, freedom of speech and expression) could increase happiness.

    But it does support the belief that many people have: that religion can often serve to reinforce and even create barriers and mutual suspicion.
    Okulicz-Kozaryn, A. (2011). Does religious diversity make us unhappy? Mental Health, Religion & Culture, 1-14 DOI: 10.1080/13674676.2010.550277

    Creative Commons License This article by Tom Rees was first published on Epiphenom. It is licensed under Creative Commons.


    Prieto-Márquez, A., and M. A. Norell. 2011. Redescription of a nearly complete skull of Plateosaurus (Dinosauria: Sauropodomorpha) from the Late Triassic of Trossingen (Germany). American Museum Novitates 3727: 1-58.

    Abstract - The nearly complete, disarticulated skull of AMNH FARB 6810, a specimen of the basal sauropodomorph Plateosaurus collected in 1925 from the Norian (Late Triassic) strata of the Knollenmergel beds of Trossingen (Germany), is redescribed. This study supports referral of AMNH FARB 6810 to P. erlenbergiensis on the basis of the following characters: occipital condyle above level of parasphenoid; basisphenoid with transverse, subvertical lamina extending between basipterygoid processes, with ventrally projecting median process; and peglike process projecting medially from the middle of the palatine. Furthermore, P. longiceps is regarded a junior synonym of P. erlenbergiensis because the type specimen of the latter is diagnostic (displaying the above-noted apomorphies of the braincase and palatine) and, chronologically, P. erlenbergiensis has priority over P. longiceps.

  • 11/21/11--22:47: Tetragraptines
  • Colonies of Tetragraptus quadribrachiatus, from the University of Oslo.

    In preparation for this post, I have been attempting to develop an understanding of graptolite branching patterns. This is not something that should be attempted lightly, if at all. If anything in this post seems confused, it's because it is.

    The Tetragraptinae were a group of graptolites that lived during the Lower Ordovician, and formed part of the early radiation of planktonic graptoloids. In one of the earlier phylogenetic (or at least quasi-phylogenetic) classifications of graptolites, that of Fortey & Cooper (1986), the tetragraptines (including the genera Tetragraptus and Pseudophyllograptus) were recognised on the basis of what was called the 'Tetragraptus serra proximal type'. In an earlier post, I explained how graptolite colonies grew as a series of branching zooids (individuals). The colony section for each individual zooid is called the theca, and graptolite workers usually refer to the thecae in discussions rather than the zooids (as the zooids are generally not preserved in fossils). A developing colony starts with the initial larval zooid, called the sicula. Out of the side of the sicula grows the first mature theca, which is referred to as th11 (the sicula is not included in the thecal count because it has a different growth pattern from the sequential thecae). The second theca, th12, then buds off from th11. The third theca to arise is th21, then th22, then th31, and so on and so forth. If all these bud in a simple sequence, the colony is not branching. However, if one or more of these basal thecae is what is known as a dicalycal theca (it produces two daughter thecae instead of just one), the colony branches. In most tetragraptines, th12 is a dicalycal theca, as are its two daughter thecae, so the mature colony has four branches. The basal canals of th12 and th21 crossing over the sicula, plus the proximal part of th22, make the lower part of the proximal region very robust: this massiveness is what characterises the Tetragraptus serra proximal type. Other characters listed by Fortey & Cooper (1986) as synapomorphies for the Tetragraptinae, reclined colony branches and a reduction in the number of branches, were also found in other lineages.
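The budding arithmetic above can be checked with a back-of-the-envelope sketch. This is my own simplification, not from the cited papers: if every theca buds a single daughter, the colony grows as one series, and each dicalycal theca adds exactly one extra growing series.

```python
def branch_count(n_dicalycal):
    """Branches in a simplified colony model: one initial series growing
    from the sicula, plus one extra series per dicalycal theca (a theca
    that buds two daughters instead of one)."""
    return 1 + n_dicalycal

# A colony with no dicalycal thecae never branches.
print(branch_count(0))  # 1
# Tetragraptine pattern: th1^2 is dicalycal, and so are both of its
# daughters -- three dicalycal thecae, hence the four-branched colony.
print(branch_count(3))  # 4
```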

    Proximal region of Tetragraptus bigsbyi, showing robust morphology, together with diagrammatic representation of thecal connections in early colony. From Bulman (1970).

    The Tetragraptinae were one of a number of groups of Ordovician graptolites with four-branched colonies, though other taxa lacked the T. serra proximal region. In a phylogenetic analysis of graptoloids, Maletz et al. (2009) identified four-branched graptoloids as a single clade that they called the Tetragrapta. This is in contrast to Fortey & Cooper (1986), who placed these taxa at a number of places in the graptoloid tree. The analysis of Maletz et al. (2009) differed from that of Fortey & Cooper (1986) in being a computational analysis rather than one constructed 'by hand'. Some characters given high weight by Fortey & Cooper (1986), such as the presence of a structure called a virgella, were found to be less significant by Maletz et al. (2009). However, in some regards the coverage of the later study was less complete than that of the earlier one. Most notable for the present post is that Fortey & Cooper (1986) had also included 'Dichograptus' solidus in the Tetragraptinae. This species apparently also has the T. serra proximal region, but has more than four branches in the colony. It is possible that its inclusion in a computational analysis would weaken the association of four-branched graptoloids as a clade.

    By the end of the Ordovician, the graptoloid lineages with multi-branched colonies were extinct. There have been numerous suggestions for why this may have happened—buoyancy issues or competition between zooids are among the front runners—but for the rest of graptoloid history, simplicity would become the watchword.


    Bulman, O. M. B. 1970. Graptolithina with sections of Enteropneusta and Pterobranchia. In Treatise on Invertebrate Paleontology Part V 2nd ed. (C. Teichert, ed.) pp. V1-V149. The Geological Society of America, Inc.: Boulder (Colorado), and the University of Kansas: Lawrence (Kansas).

    Fortey, R. A., & R. A. Cooper. 1986. A phylogenetic classification of the graptoloids. Palaeontology 29: 631-654.

    Maletz, J., J. Carlucci & C. E. Mitchell. 2009. Graptoloid cladistics, taxonomy and phylogeny. Bulletin of Geosciences 84 (1): 7-19.


    Why are two breeds of dogs that can't mate without human assistance the same species, while two fish species, which can and do have fertile offspring, offspring that are intermediate in size and therefore not as good at obtaining resources as the parents, are different species?

    The dog example is pre-zygotic isolation, and would seem prohibitive, if not for human assistance. The fish example is called "extrinsic post-zygotic isolation". So we consider populations that can't actually interbreed to be the same species, but call those that really do mate different species.

    Personally, I can go either way (but do have a preference), but I really wish we could all agree to apply the Biological Species Concept a little more rigorously. My point is always that the BSC doesn't always work (as with (mostly) asexual species, such as bacteria), and other definitions should then be used. My view is that if two populations are different species by any of a set of good species definitions, then they should be called different species. This is an all-encompassing view of what species are.

    Don't lose track of the fact that what we are trying to do when we designate something as species is to say something about biology. At the end of the day, species is a term that must say something about the clustering of genomes, and remember that it is possible to cluster a continuum.

    Other good species definitions include the Ecological Species Concept, which classifies species as a set of organisms adapted to a particular set of resources, called a niche, in the environment. This definition* is more difficult to test for in natural populations, but that is neither here nor there when we talk about these basic theoretical questions.

    And do note that here I am not even talking about the appropriateness of applying one definition when it doesn't match the actual process by which speciation occurred. Two populations may diverge and become different ecological species despite continuous interbreeding, and only after many generations become reproductively isolated (as in not able to have fertile offspring, for whatever reason, save physical isolation). Thus, saying that there are only different species many generations hence when some mutation occurs that changes the ability of sperm to enter the egg, say, makes no sense in the light of the adaptive processes that made the two groups different.

    Let's call two populations different "reproductive" species or "ecological" species when the BSC or the ESC applies, and let's for Heaven's sake be rigorous when applying them!

    P.S. If you want to have a crack at this, please don't think you can resolve this by putting meaning into the use of the words "breeds" and "species" in the examples above, 'kay?

    * Oh why oh why must we call the definitions "concepts"? A "species" is a concept. Tsk, tsk, Mayr.


    Rice, W., & E. Hostert. 1993. Laboratory experiments on speciation: what have we learned in 40 years? Evolution 47 (6). doi:10.2307/2410209


    As well as growing GFAJ-1 and making DNA I've been doing competence assays on Haemophilus influenzae strains.  This is old-fashioned microbiology, and I seem to be the lab wizard at these assays (able to do them faster and more reproducibly than anyone else).

    The big task is characterizing the starvation-induced competence responses of a number of 'unmarked' knockout mutants the RA has made.  She's closing in on her goal of knocking out every gene associated with competence.  She's actually made the deletion mutants of all of them, but she hasn't yet succeeded in removing the antibiotic resistance cassette (the 'marker') from eight of them.  (This is necessary to eliminate possible confounding effects on downstream genes.)  Here's a summary figure showing all the competence genes in the CRP-S regulon:

    She recently gave me a list of 12 mutants to test, preferably three replicates of each.  I've now done two replicates of 6 of them (plus control wildtype cells), with the expected result that none could be transformed at all.  If these results are concordant with earlier results using the marked mutants I don't think we need a third replicate.  The knocked-out genes are pilA (the major type 4 pilin), pilC (pilus assembly protein), comC (pilO homolog, pilus assembly), comE (pilQ homolog; secretin pore), comN (pulG homolog, probable minor pilin), and comP (probable minor pilin).
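    For readers unfamiliar with these assays, the arithmetic behind a transformation frequency is simple: count colonies on a selective plate and on a non-selective (total-count) plate, correct each for its dilution and plated volume, and take the ratio. The numbers below are invented for illustration, not from any real experiment.

```python
# Hypothetical plate counts: colonies on a selective plate (transformants)
# and on a non-selective plate (total viable cells), each from a known
# dilution and plated volume.

def cfu_per_ml(colonies, dilution, volume_ml=0.1):
    """Colony-forming units per ml of the undiluted culture."""
    return colonies / dilution / volume_ml

# e.g. 42 colonies from a 10^-2 dilution on the selective plate,
# 137 colonies from a 10^-6 dilution on the total-count plate
transformants = cfu_per_ml(42, 1e-2)    # 4.2e4 CFU/ml
total         = cfu_per_ml(137, 1e-6)   # 1.37e9 CFU/ml

freq = transformants / total
print(f"transformation frequency = {freq:.2e}")   # -> transformation frequency = 3.07e-05
```

    A mutant that "can't be transformed at all" means zero colonies on the selective plates, so only an upper limit on the frequency (set by the detection limit) can be reported.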

    On Saturday I did the other 6 mutants, but I overreached myself, trying to do them along with another experiment (see below).  I could tell at the time that it was hard to keep everything straight, and the results bear this out.  Some plates that should have lots of colonies have none (I suspect that these were old plates that I forgot to supplement with fresh hemin), and even some of the replicate plates differ by ten-fold or more.  The results for some of the strains are about what I expected, but I really don't trust any of the numbers and I think I need to do the whole lot at least twice more.  These knocked-out genes are HI0659 and HI0660 (cytoplasmic proteins with no known function), pilF2 (outer membrane, required for pilus assembly), dprA (cytoplasmic, protects DNA from degradation), HI1631 (location unknown; no known function), and comJ (cytoplasmic, no known function).  All but comJ are competence-induced genes in the CRP-S regulon.

    The other experiment I was doing on Saturday was a time course of competence development by three strains.  I'll discuss this in the next post.

    11/22/11--06:19: Nocturnal orchid

    Nocturnal flowering is rather uncommon, even in the tropics. Quite a few flowers open in the evening (nutmegs, for example), but those flowers remain open through at least the next day. And of course there are several bat- and moth-pollinated flowers, and several palms flower at night, attracting hordes of beetles. But a completely nocturnal orchid is news, because of all the crazy things orchids do, nocturnal flowering was not previously one of them. Since this orchid (Bulbophyllum nocturnum) flowered in captivity, no one is certain about the pollinators, but it is certainly some type of small insect, perhaps a small dipteran. The dangly little filamentous appendages are strange; they may mimic some sort of insect, help disperse a particular floral odor, or both. In general the flowers of orchids in this genus are pretty small, ca. 1 cm in diameter. Unfortunately, and particularly with the faddish approach to science funding in the USA, even the small amounts of support needed to fund tropical field work are difficult to find, which makes such discoveries really difficult.


    One of the projects the post-doc has developed is to identify the genetic differences responsible for the very low competence of a clinical strain of H. influenzae.  This strain (86-028NP, 'NP' for short) transforms 100-1000-fold less efficiently than the standard lab strain Rd after the standard competence-inducing treatments.  To identify the genes responsible for this difference he transformed competent Rd cells with DNA from NP, and screened the recombinant cells for ones whose induced competence level had decreased (he had lots of help from a very diligent and competent undergraduate).  (The strains they screened were the same strains whose genomes he sequenced to map their recombination tracts.)  The undergraduate identified one strain whose competence was about tenfold lower than Rd's.

    Now I just want to confirm that the recombinant strain does have an intermediate phenotype, and, more generally, to carefully check the phenotypes of both parents and the recombinant under all of the conditions known to affect competence.  So I want to check competence at all stages of culture growth in rich medium (a detailed time course) and after transfer to the starvation medium MIV.

    Below are the results of a simple time-course experiment using cells growing in rich medium.  It's not a detailed time course since I only did 4 time points, so I plotted transformation frequency as a function of cell density rather than of time, but it clearly shows that the recombinant has an intermediate phenotype.
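    Since transformation frequencies span several orders of magnitude, the natural way to summarize a strain across time points is a geometric mean (averaging on a log scale). The numbers below are hypothetical, chosen only to show what an "intermediate phenotype" looks like numerically; none of them come from the actual experiment.

```python
import math

# Hypothetical transformation frequencies at four culture densities,
# for the two parents and the recombinant (illustrative values only).
freqs = {
    "Rd":          [1e-6, 3e-6, 1e-5, 2e-5],
    "recombinant": [1e-7, 3e-7, 1e-6, 2e-6],   # ~10-fold below Rd
    "NP":          [1e-9, 2e-9, 5e-9, 1e-8],
}

def geo_mean(xs):
    """Geometric mean: the right average for log-scale quantities."""
    return math.exp(sum(math.log(x) for x in xs) / len(xs))

means = {strain: geo_mean(f) for strain, f in freqs.items()}
for strain, m in means.items():
    print(f"{strain:12s} {m:.1e}")

# An intermediate phenotype: the recombinant sits between the parents.
assert means["NP"] < means["recombinant"] < means["Rd"]
```

    Plotting these against cell density on log-log axes, as in the post's figure, makes the roughly constant fold-difference between strains appear as a constant vertical offset.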

    Next I'll do a better time course and MIV-induced competence.  I should also test induction of competence by transient shift to anaerobic culture, and by addition of cAMP to log-phase cells.
