by Samantha Jones
As we enter the last few weeks of November, moustaches are growing thick and recreational cannabis use has been legal in Canada for a little over a month. With the legalization of marijuana across Canada, the government has been working hard to make sure that citizens know the risks of using cannabis. In Ontario, marijuana is available for purchase online at the Ontario Cannabis Store (OCS) website. In addition to selling cannabis, the website also provides information about the anatomy of cannabis, how it works, and how to lower one’s risk of the health issues associated with cannabis use. Although mental health risks are some of the most widely discussed problems associated with cannabis use, there are also concerns about physical health repercussions, with several studies finding a possible link between cannabis use and different types of cancer. Given that it is Movember (an event that aims to raise awareness of men’s health issues) and that cannabis has recently been legalized in Canada, I thought this would be a perfect time to discuss recent research findings on a possible association between cannabis use and testicular cancer.
Testicular cancer occurs when there is abnormal growth of cells in one or both testicles. The most common form of testicular cancer involves a testicular germ cell tumour, which results from abnormal germ cells (in this case, specialized cells that give rise to sperm) in the testicles. Certain biological factors, such as having an undescended testicle or one that did not descend normally, a family history of testicular cancer, and age, may influence one’s risk. Testicular cancer is most often diagnosed in young males ages 15-35 years old. Interestingly, in Canada, cannabis use is most common in young males aged 15-24 years old. Thus, it is important to acknowledge and examine whether there is a potential association between cannabis use and testicular cancer.
A meta-analysis of previous studies examined cannabis use factors and their possible connections with testicular cancer. Several cannabis use behaviours were found to be correlated with an increased risk of developing testicular cancer involving a germ cell tumour. Specifically, current cannabis use, using cannabis at least once a week, and long-term cannabis use (i.e. over 10 years) were related to a higher risk of developing a testicular germ cell tumour.
More recently, a longitudinal study looked at cannabis use and the incidence of a variety of forms of testicular cancer. Male participants were first recruited and studied in 1970, when they were between the ages of 18 and 21. Cannabis use was assessed in 1970, and the incidence of testicular cancer was measured 42 years later, in 2011. Similar to other studies, the results showed that testicular cancer was most often diagnosed in young men (specifically, ages 25 to 40). Researchers found that “heavy” cannabis use (defined as use more than 50 times in a lifetime, as reported in 1970) was significantly related to subsequent testicular cancer diagnosis. In other words, men who reported “heavy” cannabis use between the ages of 18 and 21 had a higher risk of developing testicular cancer later in life compared to those with little or no cannabis use.
Research on the association between cannabis use and testicular cancer is limited and correlational, and more studies should be done to investigate this potential link. However, it is important to note that certain patterns of cannabis use are significantly associated with the development of testicular cancer. In the spirit of Movember, I urge you all to know your risks, and for those who may be affected, to check your testicles at least once a month. More information on how to properly check yourself is available at the Movember website.
by Jenna Finley
Homecoming has come and gone for another year. An article about drinking alcohol and hangover cures would probably be beneficial to the portion of the university population that may have gone just a little too far this past weekend. However, with the new legislation that recently legalized recreational marijuana use, I thought it would be a good idea to take a more critical look at a new type of hangover that many have heard of and possibly been affected by: the “weed hangover”.
I was introduced to “weed hangovers” while watching a morning show and was immediately intrigued. When looking further into it, I found that multiple websites mentioned similar symptoms constituting this hangover, including brain fog, dry mouth, dehydration, dry eyes, and exhaustion. Commonly listed cures and avoidance strategies included staying hydrated (both while and after smoking), avoiding overly salty foods, exercising the morning after, and sleeping – which, to be fair, all seem like great advice for general day-to-day life.
Most of the science associated with articles on this phenomenon is based on two research papers, both of which have some glaring issues with their methodology. The first paper was published in the journal Drug and Alcohol Dependence and reported findings from a study of 13 males who smoked either active marijuana cigarettes or placebo cigarettes. Researchers found significant differences between the active and placebo groups in a couple of what the paper calls ‘subjective effects,’ but none of them overlap with the symptoms listed on the previously mentioned websites. Additionally, the study had a very small sample size consisting of only males, took place over a single night, and the researchers themselves acknowledge that their findings are “subtle and of undetermined functional significance”. Therefore, it is difficult to draw strong conclusions from the paper or apply them to the concept of a hangover.
Looking now at the second article, published in the journal Pharmacology Biochemistry and Behavior, researchers examined 12 men over the course of two weekends after smoking marijuana. Despite being used as support for the existence of a ‘weed hangover’, the paper shows no conclusive proof that such a thing exists. In fact, the article itself says that “marijuana smoking was not [found to be] associated with a ‘hangover’ syndrome similar to those reported after use of alcohol or long-acting sedative-hypnotics.”
Both of these papers also seemed to employ very low doses of THC, and all of the participants were cited as being novice to moderate users of marijuana, which narrows the applicability of these findings even further.
Other similar studies that investigate residual effects of marijuana use on mood, behaviour, and physiology have found little evidence for ‘marijuana hangovers’. However, it should be noted that the research in this area is sparse.
From the research presented, it seems that people may be applying the term “hangover” incorrectly when talking about marijuana use. That’s not to say that there are no withdrawal conditions associated with marijuana. As with most drugs, when a certain level of regular use and dependency is reached, withdrawal symptoms may appear after prolonged periods of abstaining from the drug. That being said, these withdrawal symptoms do not constitute a hangover.
Feel free to stay hydrated, sleep in, and avoid too much sodium, but that’s mostly just going to help with your general well-being. As of now, there doesn’t seem to be a “weed hangover” for these strategies to help with in the first place.
by Alexandra Brooke
In recent years, GMOs have been a largely controversial topic, with many anti-GMO movements attempting to rid our food of those “pesky” GMOs. The question many are asking is: “What are GMOs?”
First, it’s important to define what we are talking about. GMO stands for Genetically Modified Organism, which includes genetically modified edible plants. GMO is often used to refer to organisms that have either had parts of their DNA removed or external DNA (often from other organisms) inserted. However, the term genetic modification can include anything from artificial selection (breeding certain plants selectively to produce larger and better-tasting crops) to the direct modification of plant DNA (e.g. marking DNA sequences that code for desired traits and combining this with selective breeding, or removing genes from and inserting genes into the DNA of plants). Due to this broad definition, it’s difficult to definitively say whether GMOs are beneficial or detrimental to society, but it is possible to look at the history of GMOs and where this controversy began.
The primitive origin of genetic modification began long before crops even existed – with the dog. In approximately 30,000 BCE, wild wolves were domesticated and bred for docility (and later loyalty), marking the beginning of artificial selection by humans. Evidence found at archaeological sites shows that later, as humans began to settle and plant crops, they artificially selected plants with the most nutritional value.
Although genetic modification has existed for over 32,000 years, it is not artificial selection that has caused controversy with the general public. What is being scrutinized is the direct changing or “editing” of genes in plants. Known as genetic engineering, this technique of gene splicing began in 1973 with the work of Boyer and Cohen, who transferred a gene from one organism into another to produce a desired effect (in this case, antibiotic resistance in bacteria). This experiment gave way to the modern genetic modification of plants and animals, a field that remains mysterious to much of the general public and that many do not support. In 1992, the first modern genetically modified crop on the U.S. market, the FLAVR SAVR tomato, was introduced. These tomatoes were designed to rot more slowly so they could retain their firmness during shipping, and although the genetic engineering did not lead to the desired effect, the FLAVR SAVR tomato paved the way for other genetically engineered foods.
Genetic engineering with plants consists of many complex steps. First, a gene must be isolated, which includes finding the desired gene within one organism’s DNA and the target location of the gene within the DNA of a seed. Second, the DNA of the target seed must be extracted in one of many ways (though this sounds simple, it’s difficult to isolate the DNA of a seed without destroying the seed in the process). Next, the DNA is substituted in one of multiple ways (including CRISPR technology, bacterial modification, and “gene guns”, which literally shoot DNA-coated metal particles into the seed). Finally, the seeds are planted. Despite the careful planning involved, not all plants will show the desired trait, in which case scientists will troubleshoot and modify the process.
The controversy surrounding genetic engineering seems to have begun in the 2000s, with the launch of the Non-GMO Project in 2007, which is responsible for things like the non-GMO labels seen in supermarkets. This project aims to promote non-GMO agriculture to support farmers’ authority in deciding what to grow and greater genetic diversity in agriculture, and to provide people a way to choose whether they want to consume GMOs.
In addition to the tests done with GMOs before they are approved for consumers, a number of studies examine the environmental and health effects of GMOs. Some of the main environmental concerns include the following: the increased use of herbicides to kill weeds when growing herbicide-resistant plants, the loss of diversity in agriculture, and the development of “superweeds” or “superviruses” in response to genetic engineering. With regard to human health, a 2013 report in Critical Reviews in Biotechnology found no “significant hazards directly connected with the use of genetically engineered crops”. However, many scientists maintain that continuous research needs to be done to investigate the long-term effects of GMOs.
In the midst of GMO controversy, it’s important to keep in mind why agricultural plants have been genetically engineered in the first place. The reasons are numerous, such as designing crops that produce their own pesticides, anti-allergen foods (including gluten-free wheat), pesticide-free corn, and “golden rice” enriched with vitamin A to address extreme vitamin A deficiency.
Although it is helpful to know where our food comes from and how it is created, being critical of social media “buzzwords” and learning more about the scientific concepts behind food production can help us to get a better understanding of things like GMOs and reduce the fear of the unknown. In the end, it’s people’s own choices as to what they eat or don’t eat, but it’s important to make sure it’s an informed decision.
by Tania Kazi
You’ve just finished writing your last exam and are downtown enjoying the evening with your friends, but as the night goes on you start to feel tired and sick.
It’s not a fluke! The phenomenon of illnesses popping up right after hectic life events, such as exams and midterms, is often referred to as the “let-down” effect. The let-down effect is a generalization that attributes unanticipated sickness to the sudden drop in stress that follows high-stress periods, and it is commonly exhibited after important events like weddings, competitions, or tests. There are research findings on various health conditions that support this theory. Using self-report assessments and diary entries of patients who had headaches that were comparable in severity, Lipton et al. (2014) found that a sudden drop in stress levels is related to the onset of a migraine. However, the direct effect of stress on physiology goes hand-in-hand with the numerous psychological and routine variations stress produces, all of which contribute to post-exam sickness.
For example, it is very common to see people living in the library and drinking tons of coffee in order to stay awake during examination periods. This results in a great deal of sleep deprivation among students and manages to further increase their tension and stress levels as exams approach, setting the stage for the let-down effect. Furthermore, sleep deprivation has long been known to lower immunity, and a 2011 study showed that the immune system’s activity levels generally peak during nocturnal sleep (that is, sleep that occurs while it is dark outside). Many other researchers have also revealed that nocturnal sleep helps promote immunological memory, which refers to the immune system’s ability to keep recollections of previous invaders the body has faced in order to make fighting them off easier if they ever reappear in the future. Therefore, a lack of sleep and a disrupted sleep schedule can drastically alter your body’s natural cycle, leaving you more susceptible to viruses and likely to feel down in the dumps at the end of the semester.
Moreover, stress-induced eating habits also tend to flare up during exam season, which results in poor nutrition and overeating amongst students. A bad diet is typically linked to many major diseases in North America, and also results in low energy levels and digestion problems. Specifically, poor nutrition has been related to negative changes in the human gut’s natural environment, such as decreased biodiversity (fewer types of bacteria) leading to a loss of some gut functions. This lowered biodiversity may, in turn, trigger the gut’s microorganisms to stray from their standard functions, activities, and population numbers in order to adapt to the new living conditions. Unfortunately, the consequences can include nutrient absorption issues in the digestive system and numerous deficiencies of biological necessities like vitamins and minerals. Additionally, a hormone called leptin, which is associated with appetite levels in relation to body fat percentage, is severely decreased with the change in the gut’s bacterial environment. This change can result in a decrease in leptin’s activation of macrophages in the immune system, which suggests that the system’s ability to destroy invaders may be heavily impacted, leaving the body even more susceptible to illnesses. Overall, these findings show that bad nutrition can affect the immune system in ways that seem to increase the likelihood of a post-exam sickness materializing out of thin air.
However, there is still hope for this exam season, as you can decrease your chances of getting sick by getting enough sleep every night, avoiding midnight McDonald’s runs, and minimizing stress levels as we head into exams. Only then will you be able to avoid falling victim to the notorious post-exam bug!
*If you are too overwhelmed by exams and would like to talk to someone or need to see a physician, there are resources on campus that can help.
Queen's Counselling Services
AMS Peer Support Centre
Queen's Health Services
by Greg Eriksen
What is Artificial Intelligence?
Have you ever wondered how self-driving cars work or how Google Home replies to you as if it were human? The answer is artificial intelligence (AI). AI refers to the ability of a machine to process information and simulate cognitive functions in response. Through mechanisms such as neural networks and deep learning, a machine can be trained to learn through experience. Consequently, AI machines are able to adjust to new inputs and come up with an appropriate response or action. This ability to accommodate new information helps to explain why self-driving cars don’t shoot off to the side with every bend in the road!
What is a Neural Network?
One of the primary systems that artificial intelligence machines use is the neural network. A neural network is a computer system inspired by the biological inner workings of the human brain. These networks are composed of a series of layers, starting at what is known as the input layer. For each input, the system can dynamically analyze the data, which subsequently fires down different pathways of the neural network. The input continuously passes down the layers of the network, with each layer picking out more detailed features. Through this filtering process, the system is eventually able to recognize the input and produce an appropriate output or response.
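To make the idea of layers concrete, here is a minimal sketch in Python (using NumPy) of a forward pass through a tiny network. The layer sizes and random weights are invented for illustration; a real network would learn its weights from training data rather than use random ones.

```python
import numpy as np

def relu(x):
    # Non-linear activation: only sufficiently "strong" signals pass through
    return np.maximum(0, x)

def forward(x, layers):
    """Pass an input through each layer of weights in turn."""
    for weights, biases in layers:
        x = relu(x @ weights + biases)
    return x

# Toy network: 4 input features -> 3 hidden units -> 2 outputs
rng = np.random.default_rng(0)
layers = [
    (rng.normal(size=(4, 3)), np.zeros(3)),  # input layer -> hidden layer
    (rng.normal(size=(3, 2)), np.zeros(2)),  # hidden layer -> output layer
]

output = forward(np.array([1.0, 0.5, -0.2, 0.3]), layers)
print(output.shape)  # (2,)
```

Each layer transforms the signal it receives from the previous one, which is the "filtering" described above; stacking many such layers is what makes a network "deep".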
What does AI have to do with medicine?
Other than being able to sleep while a self-driving car takes you to work, there are plenty of other potential applications for artificial intelligence that are likely more useful! A major area that AI can improve is the analysis of big data, especially in the context of medicine. Big data is data so voluminous that traditional processing mechanisms cannot efficiently analyse it. Using AI technology, big data analytics can examine large amounts of data to uncover hidden patterns and insights in an extremely time-efficient manner. In medicine, there are many fields that contain large amounts of data stored deep within online files, ranging from the millions of images in radiology to the millions of chemical compounds created in drug analysis.
With the ever-growing database in medicine, AI machines can access this data and run problem-solving algorithms in a process that would take humans much longer. In a recent development, an artificial intelligence company known as Atomwise has developed a supercomputer which is able to analyse big data more efficiently than ever before. The supercomputer, known as AtomNet, is used primarily for pharmaceutical analytics. AtomNet is the first structure-based AI system that can predict the biological activity of small molecules used for drug discovery applications. Consequently, the company is able to analyze millions of theoretical molecules without wasting any materials. One of Atomwise’s biggest discoveries came from their research on the Ebola virus. Once the structure of the Ebola virus was found, it was modelled on the AtomNet supercomputer. Following this, millions of simulations took place analyzing the effects of different molecules on the virus. In what would have taken traditional analytical processes months, the AtomNet AI system found two potential Ebola-fighting treatments in less than one day!
The fundamental principle in biology that AtomNet exploits is that structure is largely associated with function. The ability to determine where chemical bonding can take place is therefore essential for the discovery of new drugs, and so AtomNet uses a convolutional neural network system that incorporates structural information in its analysis. By doing so, the system can assess how different molecular structures chemically fit together. In research similar to the Ebola work, AtomNet investigated almost 82 million molecules and eventually discovered a protein-protein interaction inhibitor as a potential treatment for the autoimmune disease multiple sclerosis!
Pharmaceutical analytics is not the only medical field that AI can improve. An AI platform known as Arterys has been developed to assist radiologists in analyzing various medical images! Furthermore, another company known as 3Scan has created a system to efficiently analyze tissue pathology. Perhaps the most exciting partnership with AI technology is with the gene-editing CRISPR-Cas9 system. In short, this system is derived from a bacterial immune response against viruses. The CRISPR-Cas9 complex is able to use the genetic information of a virus to code for the destruction of that specific virus. With new advances in genome editing, the CRISPR system can potentially edit any DNA molecule. One of the main barriers to its success is the problem of off-target effects. To test these potential effects without stepping over ethical boundaries, Microsoft wants to turn to AI technology! The partnership between AI and genome editing may soon revolutionise disease prevention.
Artificial intelligence is making a very strong case for its influence in the medical world. With its major advances in pharmaceuticals, radiology, and genome editing, it is paving a very promising future. Although self-driving cars may be awesome, a disease-free world sounds a whole lot better.
by Lauren Lin
Huntington’s disease (HD) is a fatal neurodegenerative disease that has symptoms such as chorea (jerky, involuntary movements), loss of coordination, and difficulties with walking, talking, swallowing, focusing, recalling memories, and making decisions. People with HD may also experience increased anxiety, depression, aggression, and impulse control issues. As a neurodegenerative disease, the symptoms begin with subtle issues associated with the previously mentioned symptoms and become more severe over time.
HD is one of the few neurodegenerative diseases with a clear genetic component. In 1993, it was identified that HD is caused by a mutation in a gene on chromosome 4 that codes for the huntingtin protein (the Htt gene): an expansion of a repeating stretch of DNA made up of CAG triplets. Although CAG repeats are found in healthy individuals, individuals with HD have very high numbers of CAG repeats in the Htt gene, and more repeats are associated with more serious manifestations of the disease. The mutated huntingtin gene is dominant, meaning that individuals need only one copy of it from their mother or father to have HD. Therefore, children of a parent with HD have a 50% chance of inheriting the disease. Symptoms usually appear between the ages of 35 and 55, but individuals may have symptoms starting before 20 (called Juvenile HD) or in late adulthood (Late Onset HD).
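For a sense of what counting CAG repeats means in practice, here is a toy sketch in Python. The sequences below are made up for illustration (they are not real Htt gene data), and the repeat counts in the comments are only rough figures for the typical healthy and HD-associated ranges.

```python
import re

def longest_cag_run(dna):
    """Length (in repeats) of the longest run of consecutive CAG triplets."""
    runs = re.findall(r"(?:CAG)+", dna)
    return max((len(run) // 3 for run in runs), default=0)

# Hypothetical sequences, not real gene data
healthy = "ATG" + "CAG" * 20 + "CCT"   # ~20 repeats: within the typical healthy range
expanded = "ATG" + "CAG" * 45 + "CCT"  # a much larger expansion, as seen in HD

print(longest_cag_run(healthy))   # 20
print(longest_cag_run(expanded))  # 45
```

Genetic tests for HD do essentially this (with lab techniques rather than string matching): they measure how long the CAG run in a person’s Htt gene is.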
The function of the huntingtin protein in healthy individuals is still unclear, but the protein seems to play a role in the function of nerve cells since huntingtin appears to interact with proteins that only exist in the brain. The mutated huntingtin gene leads to abnormal aggregates of huntingtin protein fragments in the brain called neuronal inclusions. The basal ganglia, a brain area that is involved in movement coordination, seems to be the most affected by neuronal inclusions. However, the cerebral cortex, which plays a role in cognitive processes like attention, is also vulnerable to the effects of the huntingtin protein. The symptoms related to the cerebral cortex (i.e. cognitive difficulties) show up later than motor difficulties, which are associated with effects of the abnormal huntingtin protein on the basal ganglia.
Currently, there are no drugs that can prevent or slow down the progression of Huntington's disease, but drugs are given to people with HD to help manage their symptoms. For example, some antipsychotic drugs such as haloperidol may be given to patients with HD to help with hallucinations (which sometimes individuals with HD experience), violent outbursts, and chorea. Antidepressant and anxiolytic (anti-anxiety) drugs are also sometimes given to help with the psychiatric symptoms that individuals with HD may have.
However, many researchers are investigating new possible treatments for Huntington’s disease, and a new gene-silencing treatment has shown potential in treating HD. A drug called IONIS-HTTRx is an antisense drug: a short strand of synthetic oligonucleotides that selectively binds to huntingtin messenger RNA (mRNA) to block translation of the huntingtin protein. The drug is injected into the fluid around the spinal cord and is then carried to the brain in the cerebrospinal fluid.
This month, Ionis Pharmaceuticals reported their findings from a phase 1 trial that included 46 patients aged 25 to 65 from Canada and Europe. The study was 13 weeks long, during which participants were randomly assigned to be injected with one of five possible dosages of IONIS-HTTRx or a placebo. One injection was given each month, and at the end of the study, the participants who received the two highest doses of IONIS-HTTRx had about a 40% reduction in mutant huntingtin (mHTT) levels in their cerebrospinal fluid. The researchers predict that these decreases in mHTT in the cerebrospinal fluid correspond to a 55-85% reduction of mHTT levels in the brain cortex, which may lead to clinically significant results. However, more research trials need to be done, with more patients and over longer periods of time, to provide a better understanding of whether the drug is truly effective in reducing the levels of huntingtin protein in the brain and helping with HD symptoms. There are plans to conduct more trials beginning later in 2018 or early 2019.
Since only one research trial has been done with IONIS-HTTRx, and that trial had a very small sample and lasted for a short time, researchers and healthcare professionals do not yet have enough evidence to support the effectiveness of the drug. That being said, the potential for a drug to decrease the amount of harmful huntingtin protein fragments in the brain brings with it a lot of optimism and hope that we may be able to better treat Huntington’s disease.
by Amy Haddlesey
Depression is a mental disorder that is estimated to affect more than 300 million people worldwide. Major Depressive Disorder (MDD) is one of the most commonly diagnosed depressive disorders. MDD is often characterized by having at least 5 of 9 symptoms specified in the Diagnostic and Statistical Manual of Mental Disorders, Fifth Edition (DSM-5), with at least one of the symptoms being depressed mood or the loss of interest or pleasure. Other symptoms may include but are not limited to: sleep difficulties, fatigue or loss of energy, reduced ability to concentrate, feelings of worthlessness, and psychomotor agitation or retardation. As depression is a highly prevalent mental illness, it is becoming increasingly important to create new, innovative ways to detect and treat it.
In a study done by Microsoft Research, researchers looked into using social media behaviour as a way to infer the behaviours related to depression, with the intention of providing an accessible framework for early detection and diagnosis. The social media platform used in this particular study was Twitter. As a setting for the expression of many significant aspects of behaviour such as a person’s thoughts, mood, activities, and socialization patterns, Twitter provides a wealth of easily accessible knowledge about a person’s emotional condition over time without being intrusive into participants’ lives. In the past, web activity patterns and online behaviour on Facebook have also been studied in relation to mental disorders. For example, researchers have examined trends in Facebook status updates that are associated with depressive symptoms.
Using crowdsourcing, the study compiled several hundred Twitter users who participated by completing the CES-D (Centre for Epidemiologic Studies Depression Scale) screening test, the Beck Depression Inventory, and an additional survey aimed at gathering depression history and demographic information. The CES-D is a 20-item self-report scale used to measure depressive symptoms. After completing the questionnaires, users could opt in to share their Twitter usernames if they had a public profile, with the understanding that their profile could then be mined and analyzed anonymously by computerized programs. Within the study, data (Twitter posts) were collected over a year-long period from individuals who had given their consent and who had either been diagnosed with depression previously or had no history of depression. For those who had depression, Twitter data from the year leading up to their diagnosis were collected. For people who did not have depression, Twitter data were collected for the year ending on the date the survey was completed.
In total, 476 users were included in the study. Data from individuals who had depression were used to create a gold standard for the changes in activity on Twitter preceding diagnosis. The behavioural patterns on Twitter of individuals who had depression included:
· Posting patterns shifting towards later at night
· Decrease in engagement
· Higher expression of negative affect
· Lower activation
· Higher presence of first-person pronouns
· Higher use of depression terms
· Higher use of words associated with symptoms (Figure 1)
· Higher disclosure of feelings and seeking social support
· Larger discussion of therapy and treatment
Overall, there was a marked increase in certain behaviours and a decrease in others before diagnosis, suggesting that there may be a shift in behaviour on Twitter leading up to the onset of a depressive episode. By developing this gold standard using data from the depressed group, the study was able to build a statistical classifier that provides an estimate of the risk of depression. In making this prediction, both the trend of behavioural change and the degree of behavioural change over the one-year period were important in identifying behaviour related to depressive symptoms.
The study reported that, out of their developed models, their best-performing model has a 70% accuracy rating for prediction as well as a precision of 0.74. Therefore, the model seems to be able to estimate the risk of depression at an above-chance level, and it makes these estimates relatively consistently. The hope for this study, and for this area moving forward, is that the prediction process could aid in identifying behaviour associated with depressive episodes, increase early detection, and lead to a better support system being in place by the time a mental health-related issue presents.
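To see what those two numbers measure, here is a small Python sketch computing accuracy and precision from a classifier’s confusion counts. The counts are invented purely for illustration, chosen so the results match the figures quoted above; they are not the study’s actual data.

```python
def accuracy(tp, fp, tn, fn):
    # Fraction of all predictions that were correct
    return (tp + tn) / (tp + fp + tn + fn)

def precision(tp, fp):
    # Of the users the model flagged as at risk, the fraction truly at risk
    return tp / (tp + fp)

# Hypothetical confusion counts: tp = true positives, fp = false positives,
# tn = true negatives, fn = false negatives
tp, fp, tn, fn = 37, 13, 33, 17

print(accuracy(tp, fp, tn, fn))  # 0.7
print(precision(tp, fp))         # 0.74
```

Accuracy asks "how often is the model right overall?", while precision asks "when the model flags someone, how often is it right?"; for screening tools like this one, the second question is often the more important of the two.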
Figure 1. Categorization of words into those related to symptoms, disclosure, and treatment.
by Jenna Finley
We’ve all heard someone say at some point: “I think I’m getting sick so I’ll just take a ton of vitamin C and it’ll be fine.” The idea that a dose of vitamin C will keep you from getting the flu (or at least stop the illness from lingering) is one of the most common home remedies nowadays, and the reason vitamin C tablets fly off the shelves right around the beginning of cold and flu season. However, does this home remedy actually help?
Short answer: we’re not sure.
Vitamin C began to be thought of as an important guardian of health in the 1970s, when prominent doctors began recommending daily doses as a way for people to lead longer, healthier lives. But it wasn’t until the 1990s that vitamin C began to be more widely touted as a way to prevent the common cold. Products containing vitamin C began popping up on shelves claiming to cure the common cold, the most prominent of which was called Airborne. Since its release, Airborne has been the subject of multiple lawsuits over the unsubstantiated claims made about the “cold busting” power of vitamin C, and yet it has still inspired dozens of new ‘cold preventing’ vitamin C supplements.
As far as research goes, very little support has been found for the idea that taking vitamin C will help prevent an illness, at least for the general public. If you’re an extremely active person who takes a dose of 250-1000mg of vitamin C every single day, then you could reduce your cold incidence by half! Great news for Olympic athletes and marathon runners, but for the rest of us, washing our hands regularly would be more helpful.
The possibility of shortening the length of a cold and reducing its symptoms is where the research gets more interesting, though not in the way you might expect; the findings in this area are also a lot more conflicted. While some studies suggest that vitamin C can reduce symptoms by as much as 85%, others say the supplementation makes no difference. The most popular and widely cited study says that vitamin C can make a difference, but only if a 200mg vitamin C supplement is taken every single day - not just the days you’re feeling sick or the days leading up to a cold. I don’t think a lot of us can say that we meet that condition, but even if we did, the benefits aren’t too exciting. On average, this regimen leads to only one fewer day of illness.
Taking a massive amount of vitamin C at once (megadosing) is another common method people use in the hopes that they’ll finally be free of that persistent cold. While some research seems to agree with this treatment, there is yet another caveat. The dose necessary to have a chance at relieving your illness would need to be as high as 8000mg/day, which can cause a whole host of problems. In the end, the 1000mg tablets your roommate is eating like candy around exam season might be doing them more harm than good, as too much vitamin C can make you a lot sicker, resulting in symptoms like vomiting, abdominal pain, and diarrhea.
Therefore, we can’t conclusively say that vitamin C supplementation helps, but we do know that it can hurt. Most nutritionists recommend getting your daily needed vitamin C from your meals and forgoing a supplement altogether. Doses over 400mg are excreted from the body and can result in you (literally) flushing your money down the toilet.
At the end of the day, if you’re still convinced a few vitamin C tablets will help you stave off the dreaded common cold for another day, go ahead and take them, but be careful. No one wants to suffer any more than they have to during exam season.
by Rosalin Dubois
It is one of the worst times of the year - everyone, from your best friend to your professor, is getting sick. Sooner or later you probably will too, and when that time comes you’ll be struck by the same distressing experience: no matter how well you feel during the day, by the time you are ready to go to bed, you feel so miserable that you never want to leave your room again. I’ve always been told, “You’ll feel better in the morning; colds always feel worse at night!” But why is that true?
There are a few plausible explanations for this phenomenon. Some of these explanations are less scientific than others but may still be able to provide insight into why sleeping while sick can be so difficult. Consider everything that you have to distract you during the day. While you are focused on getting to class, meeting an important deadline, or even just socializing with your friends, you may pay less attention to the signals that your body is sending you that indicate you are sick. However, when you try to sleep, you have fewer distractions, and so you may notice more of these signals and feel much sicker. Additionally, when you lie down, gravity affects your body differently than when you are standing. Even just sitting up may help to clear your stuffy airways and help you to sleep better (essential for that 8:30 lecture!).
However, what if you are working late, sitting up, and still feel worse than during the day? You may be experiencing the circadian rhythms of your immune system. Circadian rhythms are physical, mental, and behavioural changes that follow a daily cycle (think of your “internal clock” telling you when to get up in the morning). Our immune system follows a similar pattern: researchers have found that immune response varies throughout the day. During the day, the part of the immune system called cell-mediated immunity (or just cellular immunity) is responsible for defending us from infection. This form of defence is very effective against viruses, bacteria, fungi, and other invaders. Most important to note is that we don’t typically feel the strain of this type of immunity at work.
At night, inflammation replaces cellular immunity. Inflammation is the type of immune response that we normally experience when our tissues are damaged by trauma, bacteria, heat, or other causes. When this happens, the cells that have been damaged release chemicals (including histamine and prostaglandins), which cause blood vessels to expand and allow more blood to reach the damaged area. Additionally, inflammatory mediators increase the permeability of blood vessels, allowing defence cells and fluid to move into the tissue and cause swelling. By surrounding the damaging substance with a barrier of this released fluid, inflammation aims to isolate the invader from our tissues, and therefore prevent it from doing more damage. The worst of our symptoms when we are sick, like fever, increased amounts of mucus, and fatigue, are largely the effects we feel when our system is inflamed.
Studies over the last decade indicate that this transition between cellular immunity and inflammation occurs due to a change in the activity of a type of white blood cell called T-cells. These cells are important in cell-mediated immunity because they recognize antigens (foreign substances) and attack the cells that carry them. It was determined that T-cells actually become less active against antigens during times when the body would normally be resting, especially at night.
This seems bizarre; why would the body turn off such important defenders when we need them most? A study done by a team of German researchers just last year may hold the answer to that question. In this study, researchers observed changes in the cell populations of the lymph nodes of mice during their active times and during their rest times. As they correctly expected, more T-cells were present in the lymph nodes while the mice were resting. However, they were surprised to find that high levels of dendritic cells were also present at this time. Dendritic cells process information about antigens and communicate this information to T-cells so that cellular immunity can effectively target the threat.
This research seems to indicate that during the day, T-cells and dendritic cells move normally throughout the body, gathering information and dealing with threats. When T-cells randomly come into contact with dendritic cells throughout the body, they receive information so that they can adapt their immune response to better eliminate the antigens. At night, both types of cell move to the lymph nodes, and the high concentration of these cells allows for a greater likelihood of interaction. Through this interaction, the T-cells will receive the information from the dendritic cells to develop a functional immune response to this threat - meaning you could potentially heal faster! Think of it as these cells meeting up to share information on any invaders to be more effective at fighting them off! Meanwhile, the inflammatory part of immunity does its best to prevent any infection from progressing further.
At the end of the day (pun intended), it seems likely that it is a combination of these factors (distraction, position, and dynamics of the immune system) that cause us to feel worse at night. Unfortunately, there still isn’t much to do to prevent this phenomenon other than what you should already be doing to treat a cold. Get well soon, Queen’s!
by Lauren Lin
“Locked-in syndrome” is used to describe a medical condition in which there is complete paralysis of all voluntary muscles in the body including most facial muscles. Individuals who have locked-in syndrome are conscious, have cognitive function, and are aware of their environment, but they cannot produce movements or speak. This condition is often caused by damage to the pons, a part of the brainstem that relays information to different parts of the brain. The damage can result from strokes, infections of the brain, or bleeding. Certain disorders like Amyotrophic Lateral Sclerosis (ALS), a motor neuron disease, can also cause total motor paralysis. Many people with locked-in syndrome can communicate through moving their eyes and/or blinking, but individuals with locked-in syndrome may eventually lose their ability to move their eyes, and so communication becomes extremely difficult.
Chaudhary, Xia, Silvoni, Cohen, and Birbaumer (2017) report on the potential of brain-computer interfaces (BCIs) to offer a way for paralyzed patients with ALS to communicate. BCI research may involve invasive procedures, like implanting electrodes in the brain, or noninvasive technologies, like functional magnetic resonance imaging (fMRI) and functional near-infrared spectroscopy (fNIRS), to record brain activity. The recorded brain activity can then be interpreted to determine what the user is communicating. Chaudhary et al. (2017) used fNIRS to measure changes in blood flow by assessing oxygenated hemoglobin (O2Hb), and used electroencephalography (EEG) to measure the brain waves of four patients who had no motor movement. The relative changes in oxygenated hemoglobin when patients responded to “true/yes” and “false/no” statements were significantly different from each other, and so fNIRS measurements were used to recognize whether the patient answered "yes" or "no". However, EEG measurements were not able to reliably discriminate between yes and no answers.
To train the patients to be able to answer questions using BCI, the researchers asked the patients to respond “yes” or “no” to personal statements with known answers like “Your husband’s name is Joachim” or “You were born in Berlin.” For each known statement with a clear “yes” answer, a similar statement with a clear “no” answer was also given. For example, if the statement “You were born in Berlin” was true, it could be paired with “You were born in Paris,” a false statement. The reverse was done for statements with clear “no” answers. The patients were explicitly told to think of “yes” or “no” answers but not to imagine the answer visually or auditorily so that the BCI would only be picking up on signals that correspond with “yes” or “no” sentiments rather than the look or sound of the words. The patients also received feedback on what their answer was interpreted as (e.g. “Your answer was recognized as ‘yes’”) during training.
The patients were presented with a total of at least 200 statements with known answers and 40 open questions or statements, which asked about the patient’s quality of life or posed questions from caretakers that only the patient could answer (e.g. “You have back pain.”). The four patients communicated using BCI with a correct response rate of 70% over the course of several weeks, which is above the level of chance (50%). Three of the four patients were asked open questions about their quality of life, such as “Are you happy?” and “I love to live.” These questions were asked repeatedly to ensure the validity of the responses. All three patients answered “yes,” which indicated an overall positive attitude towards their current situation and towards life.
BCI seems like a promising way for patients with paralysis in almost all voluntary muscles to communicate, since it does not require any motor movements. However, the interpretations of the responses are not always correct, and so it is extremely important to take precautions like asking a single question multiple times. Additionally, BCI may not be accessible in all healthcare settings, since it requires both the equipment needed to measure and interpret brain signals and the training for the patient to use it. Despite these limitations, BCI still has a lot of potential to give locked-in patients who may not have been able to communicate previously a way to convey their thoughts, especially considering that one of the patients in the study had not been able to communicate for four years. The researchers of this study are hopeful that this technology could be a stepping stone towards improving the quality of life of patients who are in a locked-in state; they even write that family members all “experienced substantial relief” when they were able to communicate with the patients, and that they still use the system.