Change starts with communication: A look into species modelling

As we see rising global temperatures, shrinking glaciers, accelerating species extinctions, and so much more, research in environmental conservation has become more important than ever in the 21st century. Dr. Nathalie Isabelle Chardon and her colleagues' research in Greenland, published on March 23, 2022, emphasizes the need for more attention to how data are collected for species prediction.

Predicting Species Distribution & Abundance

A vital part of environmental conservation is tracking and predicting where species will be found as their environment changes. These predictions are generated by examining where species are located, in what numbers, the environmental conditions they are found in, and the interactions they have with other species in the area. Such models are called species distribution models (SDMs) and species abundance models (SAMs).

Frame quadrat data collection in Greenland. Source: Nathalie Isabelle Chardon

Based on these models, researchers can predict whether or not a species would be expected to occur in another area with similar conditions. The ability to predict species distribution and abundance is crucial to understanding how species will be affected by pressures such as human interference and climate change. Scientists can then use this information to determine which species need conservation efforts, and with what degree of urgency.

New Research in SDM/SAM Modeling

Models such as the SDM and SAM are well established and have been used by researchers to study conservation biogeography (2013). However, a big part of science is that it is constantly evolving, changing, and adapting to new information, and such change is happening in SDM and SAM modelling as well. A recent study led by Dr. Chardon found that while common models that collect data from only one location provide good model performance, that does not necessarily guarantee realistic results. The study found that although models drawing on multiple data-collection locations had weaker overall performance, the results they generated were more realistic and more broadly applicable. Dr. Chardon's research also indicated that including interactions between species as a variable in these models can greatly improve model performance.
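To make the modelling idea more concrete, here is a minimal, hypothetical sketch of one common type of SDM: a logistic regression that predicts a species' presence or absence from an environmental variable (temperature) plus the abundance of a co-occurring species, the kind of species-interaction predictor Dr. Chardon's study suggests can improve performance. The data, variable names, and coefficients below are invented for illustration only; this is not the study's actual model.

```python
# Minimal, hypothetical species distribution model (SDM) sketch.
# All data and variable names are invented for illustration only.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(42)

# Simulated survey plots: mean summer temperature (deg C) and
# abundance of a co-occurring species (individuals per plot).
n_plots = 200
temperature = rng.uniform(-2, 12, n_plots)
neighbour_abundance = rng.poisson(5, n_plots)

# Simulated presence/absence of the focal species: more likely in
# warmer plots and where the neighbouring species is also abundant.
logit = -3 + 0.4 * temperature + 0.3 * neighbour_abundance
presence = rng.random(n_plots) < 1 / (1 + np.exp(-logit))

# Fit an SDM with and without the species-interaction predictor.
X_env_only = temperature.reshape(-1, 1)
X_with_interaction = np.column_stack([temperature, neighbour_abundance])

sdm_env = LogisticRegression().fit(X_env_only, presence)
sdm_full = LogisticRegression().fit(X_with_interaction, presence)

# Predict the probability of finding the species in a new plot.
new_plot = np.array([[8.0, 10]])  # 8 deg C, 10 neighbours observed
print("P(present), environment only :", sdm_env.predict_proba(new_plot[:, :1])[0, 1])
print("P(present), with interactions:", sdm_full.predict_proba(new_plot)[0, 1])
```

In the same spirit, training such a model on plots from one site and testing it on plots from another is one way to see the study's point that strong performance at a single location does not guarantee realistic predictions elsewhere.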

Importance of Science Communication in Environmental Research

The health of our planet and its biodiversity is vital to our survival as human beings who rely solely on the environment for resources. This makes research in environmental conservation important, but public awareness of the issue is just as important. Effective public awareness comes down to effective science communication between scientists and the public. To this day, we continue to see denial that climate change is real. Such conflict in opinions on climate change often stems from mistrust and miscommunication between scientists and their audience, as well as from personal beliefs about the topic. This emphasizes how important it is for scientists to share how scientific conclusions are reached, and for the public to be open to new findings.

Dr. Chardon and a colleague during their research in Greenland. Source: Dr. Nathalie Isabelle Chardon

Written by: Group 2 – Irene Choi, Joanne Kit, Jonathan Hao, & Sicong He

Membrane Trafficking and Its Relation to Disease

It is often difficult to dissect the causes of diseases like ALS, Parkinson’s, and Huntington’s at the cellular level. We do know that a malfunction of certain cellular processes causes these diseases. However, there are so many cellular tasks that it is complicated to pinpoint an exact cause. Luckily, new research investigating membrane trafficking mechanisms gives us some insight into why these diseases occur.

What is membrane trafficking?

Think of our cells as complicated machines that operate on strict protocols. One of their functions is to transport cargo from place to place, sort of like a highway system. Membrane trafficking is the process of distributing goods within a cell, into a cell, or out of a cell. These goods range from proteins to other macromolecules. The following video explains membrane trafficking in simple terms:

 

How is membrane trafficking related to these diseases?

At the moment, there isn't much research in this particular field. We do know that in brain cells, communication is carried out in the language of membrane trafficking. By distributing certain goods from brain cell to brain cell, they're able to create an information network that allows our bodies to perform actions like moving our arms or smelling the air. However, disruption of membrane trafficking breaks this information network, which can give rise to the diseases mentioned above. Fortunately, new research is investigating how these breaks in the information network occur.

The VINE complex is a VPS9-domain GEF-containing SNX-BAR coat involved in endosomal sorting

We had the amazing opportunity to interview Shawn P. Shortill, a Ph.D. candidate at BC Children's Hospital through the Department of Medical Genetics at the University of British Columbia. He has been working on a research paper on the topic of membrane trafficking.

Essentially, Shortill's research used budding yeast as a model organism to study the fundamentals of membrane trafficking pathways. Budding yeast is a great research tool: its genes are easily altered, its cells are similar to human cells, it can be grown in large quantities, it gives fast results, and it is extremely cheap. The following podcast clip gives a quick insight into the specifics of Shortill's research:

Shortill has identified a group of proteins, called VINE, located on the endosome, an important cellular component for membrane trafficking, and this protein complex is essential for regulating the process. VINE has also been speculated to be involved in the development of ALS and Parkinson's.

How is this research important?

As previously mentioned, the discovery of VINE is important because of its potential involvement in the aforementioned diseases. However, not much else can be said yet in the context of disease. Nonetheless, this is a step forward in understanding why certain diseases occur at the cellular level. The next step is to determine whether this research applies to human cells and whether VINE is truly related to these diseases.

We would like to thank Shawn for taking the time to be interviewed and for educating us so thoroughly on his research.

– Olivia Kochhar, Ryan To, Darryl Ma, Jimmy Huang

Studying DNA has never been easier…Thanks to eVITTA!

There's one thing that every single one of us has in common with each other and with all other living beings: we carry DNA in our cells. DNA is an important molecule that carries all the inherited information about how a living thing will look and function. A strand of DNA is like an extremely long sentence that uses only four letters; in fact, the human genome is about 3 billion letters long! Can you imagine how hard and time consuming it would be to read? That is why scientists at The University of British Columbia (UBC) have developed a new and exciting application called the easy Visualization and Inference Toolbox for Transcriptome Analysis, or eVITTA for short. eVITTA simplifies analyzing RNA (molecules copied from DNA) and not only makes the process more efficient, but also improves scientists' understanding of the information it contains.

The kind of DNA information that eVITTA works with is called the “transcriptome”. The transcriptome is the full set of RNA, which are the copies of DNA, within a cell. Analyzing the information in the transcriptome is crucial; for the past few decades, it has been one of the most used techniques for investigating diseases and their mechanisms.

The analysis of transcriptome data, however, is very tedious and time consuming. You have to retrieve the data, examine it, and then compare it with other transcriptome datasets to draw a conclusion. This involves several steps and different types of programs, which can be inefficient. eVITTA was created to combine the many steps of transcriptome analysis into one simple, user-friendly interface. This speeds up the process of transcriptome analysis immensely!

The tedious process of analyzing long strands of DNA is simplified with eVITTA.
Image Credit: CI Photos/Shutterstock.com

eVITTA was born out of excess data!

One of the many topics that sparked our interest was understanding the circumstances surrounding the creation of eVITTA. As Dr. Yan of UBC’s Taubert Lab puts it:

This whole project was born out of our need to pass on data. We have a lot of transcriptome data from past years that no one has gotten around to analyzing…. so during the pandemic, we started digging into those data and developing visualization modules and we realized we can actually make this into an app so that we can feed more data and generate visualizations

The team behind eVITTA

To discuss the different aspects of eVITTA and to delve deeper into this project, UBC's Shayan Abbaszadeh sat down for a virtual interview with PhD candidate Judith Yan from the Department of Cell and Developmental Biology at UBC. Dr. Yan is one of the many faces behind eVITTA at the Taubert Lab and has worked tirelessly to bring this idea to fruition. Our podcast describes her role in recognizing the gaps in efficient transcriptome analysis and building eVITTA with the rest of the Taubert Lab during the COVID-19 pandemic. Outside of eVITTA, Dr. Yan's work in the Taubert Lab usually involves using model organisms such as roundworms to study stress responses and applying that knowledge to better understand human diseases.

How does eVITTA make analyzing RNA so simple?

To grasp the scope of this research, it is first important to understand transcriptome analysis. Dr. Yan describes transcriptome analysis as a powerful tool that examines how RNA, a copy of DNA, is used in a cell, tissue, or organism. This involves extracting all of the RNA from a sample, getting it sequenced (i.e. decoded), and then obtaining information from that data.

An exploration of gene expression patterns is one of the main aspects of effective transcriptome analysis. A gene is a section of DNA that carries a specific piece of information. A crucial aspect of the analysis is understanding that some genes are turned on at higher rates than others. According to Dr. Yan, once there is a count for these different genes, the numbers can be interpreted to reveal useful information about "what some of these genes are doing" and "what processes and gene sets are actually being changed". Finally, the different datasets need effective visualization, and the data need to be validated against previously published data.
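As a toy illustration of that counting step (the gene names and read counts below are invented, and this is not eVITTA's actual output), the snippet takes counts for a few genes in a "control" and a "treated" sample and ranks them by log2 fold change, the kind of signal downstream tools then try to interpret.

```python
# Toy illustration of interpreting gene counts; all numbers are invented.
import math

# Read counts for a handful of genes in two conditions.
counts = {
    "hsp-16.2": {"control": 20,   "treated": 400},
    "daf-16":   {"control": 150,  "treated": 310},
    "act-1":    {"control": 1000, "treated": 980},
    "unc-54":   {"control": 500,  "treated": 240},
}

# Rank genes by log2 fold change (treated vs control), adding a
# pseudocount of 1 so a zero count does not break the logarithm.
fold_changes = {
    gene: math.log2((c["treated"] + 1) / (c["control"] + 1))
    for gene, c in counts.items()
}

for gene, lfc in sorted(fold_changes.items(), key=lambda kv: -abs(kv[1])):
    direction = "up" if lfc > 0 else "down"
    print(f"{gene:10s} log2FC = {lfc:+.2f} ({direction} in treated)")
```

Real pipelines also normalize for sequencing depth and use statistical models to decide which of these changes are trustworthy, which is part of what makes the full analysis so tedious to do by hand.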

All of these functions are achieved through eVITTA: a user-friendly, web-based interface that streamlines the multiple steps of transcriptome profiling. Watch this short video to become familiar with the three modules of eVITTA (easyGEO, easyGSEA, and easyVizR) and see what effective transcriptome analysis looks like!

The challenges of transcriptome analysis and how eVITTA addresses them

In our podcast, we focused extensively on the motivations behind the creation of eVITTA, specifically the challenges associated with transcriptome analysis and how eVITTA deals with them in ways that previous methods could not. Dr. Yan points to the 'over-representation approach' (ORA) used in previous tools, which has the flaw of only being able to represent a small subset of gene changes. As Dr. Yan puts it: "you're missing out on a lot of information and biologically… you are not able to capture the less severe changes." eVITTA, on the other hand, focuses on entire sets of genes instead of just one gene, allowing scientists to observe gene changes across the board.
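To see why the over-representation approach only works with a cutoff-selected subset, here is a minimal ORA sketch using a hypergeometric test (all gene numbers are invented, and this is not eVITTA's code). Only genes that pass a significance cutoff enter the test, so modest but coordinated changes across a pathway are invisible to it, whereas whole-distribution methods such as the GSEA-style analysis in easyGSEA rank every measured gene.

```python
# Minimal over-representation analysis (ORA) sketch; numbers are invented.
from scipy.stats import hypergeom

total_genes = 20000        # genes measured in the experiment
significant_genes = 300    # genes passing the cutoff (e.g. adjusted p < 0.05)
pathway_size = 150         # genes annotated to the pathway of interest
overlap = 12               # significant genes that fall in that pathway

# P(seeing >= overlap pathway genes among the significant genes by chance)
p_value = hypergeom.sf(overlap - 1, total_genes, pathway_size, significant_genes)
print(f"ORA enrichment p-value: {p_value:.3g}")

# Genes with real but modest changes never make the 'significant' list,
# so they contribute nothing here; enrichment methods like GSEA instead
# rank all 20,000 genes rather than only the 300 that pass the cutoff.
```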

Additionally, previous technologies did not make it easy to organize the data, because "oftentimes it is very tedious to do multiple comparison because you have many different subsections that you want to look at". eVITTA prevents the dumping and mislabeling of gene data, which expedites the process of discovering important biological patterns. The platform has already proven highly effective in studies involving severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) and roundworms (C. elegans).

Dr. Yan's research using eVITTA involves the starvation response of C. elegans roundworms. (Credit: Taubert Lab)

what does the future hold for eVITTA?

Our podcast also discusses the future of eVITTA and the team’s plans for further expansion. Dr. Yan and her team plan to add more algorithms to analyze the expression of different genes at a more in-depth level. The team also want to add more visualization options to provide more choice for gene dataset analysis. Finally, they are planning to update eVITTA’s databases by adding more datasets to the application. These ideas and more are discussed in our podcast below!

As transcriptome profiling provides major insight regarding diseases and illnesses, the optimization eVITTA can provide may be increasingly vital for today’s society. For instance, if transcriptome analysis can be conducted more efficiently, crucial findings in disease mechanisms may be discovered sooner. As a result, treatments and health regulations can be created more quickly, potentially saving lives and preventing disease spread. Additionally, eVITTA’s user-friendly and web-based interface makes transcriptome profiling accessible to more biologists around the world and would therefore greatly benefit the research community.

– Heather Cathcart, Kaushali Ghosh, Parham Asli, Shayan Abbaszadeh

 

 

 

Enzymatic Browning in Granny Smith Apples

Introduction

Browning is the darkening of the flesh that occurs shortly after fruits such as apples and pears are cut, exposing the flesh to air. Although browning is not toxic to humans, since the pigment is composed of melanin, it makes the fruit unsightly and unappetizing to eat. The main cause of browning is polyphenol oxidase (PPO), an enzyme that catalyzes the first two steps of converting the amino acid tyrosine into melanin in the presence of oxygen. Mechanical stress from slicing damages the cellular structure of the flesh, which triggers PPO activity as the enzyme becomes exposed to oxygen.

Video: Why Do Apples Turn Brown After You Cut Them? From Let's Talk Science

what is an effective tool to delay apple browning?

Past studies have investigated the effects of different consumable solutions on browning, and citric acid, found naturally in fruit juices, was identified as a moderate to high inhibitor of browning. Citric acid slows the onset of browning by lowering the pH of PPO's environment so that it falls outside the optimal range of pH 6-7 required for PPO to oxidize the flesh. Although other agents, such as chelators and antioxidants, can also inhibit PPO, citric acid is more commonly found in foods, especially fruit juices, which are more likely to preserve the apples' taste when applied.

ReaLemon 100% Lemon Juice, 15 Fl Oz Bottle. Image from Walmart.com

Despite citric acid being an effective tool for delaying browning, the concentration at which its effects last for an extended period of time is unknown. Lemon juice was therefore chosen, since it is known to be composed of approximately 6% citric acid. Lemon juice also contains other acids, such as ascorbic acid and malic acid, but their concentrations are negligible, since citric acid comprises about 95% of the acid content of lemon juice. Granny Smith apples were used as they are a common fruit consumed in households and are known to brown quickly after being sliced. It was hypothesized that lemon juice would delay the onset of browning because the citric acid makes the juice's pH too acidic for polyphenol oxidase to initiate the conversion of tyrosine into melanin. If lemon juice delays browning, then the area of browning that appears on apple slices over a set period of time should decrease as the concentration of lemon juice the apple is exposed to increases.
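As a rough back-of-envelope sketch (the dilution levels below are hypothetical, not the study's actual treatments), applying the roughly 6% citric acid figure to different lemon juice concentrations shows how much citric acid each dilution would actually deliver to the apple surface:

```python
# Back-of-envelope citric acid content at different lemon juice dilutions.
# Dilution levels are hypothetical; lemon juice is ~6% citric acid (Yapo, 2009).
CITRIC_ACID_IN_LEMON_JUICE = 0.06

for lemon_juice_fraction in (0.25, 0.50, 0.75, 1.00):
    citric_acid = lemon_juice_fraction * CITRIC_ACID_IN_LEMON_JUICE
    print(f"{lemon_juice_fraction:>4.0%} lemon juice ≈ {citric_acid:.1%} citric acid")
```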

Conclusion

These results show that increasing the lemon juice concentration decreases browning, because the juice's low pH falls below the range at which PPO browns apple flesh. They provide a starting point for future studies to identify more effective anti-browning agents and to further investigate which environmental temperatures should be avoided in order to delay browning.

—–Chenyang Luo

References

Son, S. M., Moon, K. D., & Lee, C. Y. (2001). Inhibitory effects of various antibrowning agents on apple slices. Food Chemistry, 73(1), 23-30.

Tinello, F., & Lante, A. (2018). Recent advances in controlling polyphenol oxidase activity of fruit and vegetable products. Innovative Food Science & Emerging Technologies, 50, 73-83. https://doi.org/10.1016/j.ifset.2018.10.008

Tortoe, C., Orchard, J., & Beezer, A. (2007). Prevention of enzymatic browning of apple cylinders using different solutions. International Journal of Food Science & Technology, 42(12), 1475-1481.

Yapo, B. M. (2009). Lemon juice improves the extractability and quality characteristics of pectin from yellow passion fruit by-product as compared with commercial citric acid extractant. Bioresource Technology, 100(12), 3147-3151. https://doi.org/10.1016/j.biortech.2009.01.039

Breakthrough in Fusion Energy

When the term "renewable energy" comes up in conversation, we immediately think of methods like solar and hydropower. These modes of energy production, however, come with pros and cons. For example, hydropower is not feasible in areas without access to moving or falling water, and solar energy only works when sunlight is available. One up-and-coming method of producing renewable energy is the fusion reaction, which has the potential to be extremely versatile.

what is fusion energy?

To fully grasp the idea of fusion energy, we have to understand the concept of nuclear fusion. At the most rudimentary level, nuclear fusion is the process of merging two light atomic nuclei together. The merging of the two nuclei releases energy that can be harnessed. However, the actual physics and mathematics are more complicated than this simple description suggests.
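To make "merging releases energy" concrete, here is a small back-of-envelope calculation (not from the article; it uses standard textbook atomic masses) of the energy released when one deuterium and one tritium nucleus fuse into helium-4 plus a neutron, the fuel combination used in experiments like the one described below:

```python
# Energy released by one deuterium-tritium fusion reaction,
# using standard textbook atomic masses (unified atomic mass units).
U_TO_MEV = 931.494  # energy equivalent of 1 atomic mass unit

mass_deuterium = 2.014102
mass_tritium   = 3.016049
mass_helium4   = 4.002602
mass_neutron   = 1.008665

# The products weigh slightly less than the reactants;
# the "missing" mass is released as energy (E = mc^2).
mass_defect = (mass_deuterium + mass_tritium) - (mass_helium4 + mass_neutron)
energy_mev = mass_defect * U_TO_MEV

print(f"Mass defect: {mass_defect:.6f} u")
print(f"Energy released per reaction: {energy_mev:.1f} MeV")  # roughly 17.6 MeV
```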

For a more concise look into nuclear fusion, refer to the following video:

The main advantage of fusion energy is that it produces little to no greenhouse gases. The disadvantages of the fusion reaction are that, currently, the energy input exceeds the energy output, and that small amounts of radioactive waste are produced.

what is the future of nuclear fusion looking like?

On February 9th, 2022, engineers at the UK Atomic Energy Authority's Joint European Torus (JET) facility in Oxford announced they had produced 59 megajoules of sustained fusion energy, breaking the previous record of 22 megajoules set in 1997. The reaction lasted for five seconds before breaking down, and the energy input needed to start it still exceeded the energy produced. Nonetheless, this experiment showed that a fusion reaction can be sustained for a meaningful length of time and is a great stepping stone for a future experiment, ITER, set to begin in 2025.

A look inside a fusion reactor – Image from Wikipedia

what is iter?

ITER is a nuclear fusion plant currently being built in southern France and is set to begin experiments in 2025. The goal of ITER is to produce 500 megawatts of fusion power in 400-second pulses. For some context, if the facility were to use one ton of deuterium, one of the two hydrogen isotopes used as fuel for the fusion reaction, that would be the equivalent of burning 29 billion tons of coal to produce energy. If the experiments are a success, there is a real possibility of a further transition away from fossil fuel use.

ITER fusion reactor – Image from ITER
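For a sense of scale, here is a quick back-of-envelope calculation using only the figures quoted above: JET's average fusion power over its five-second pulse, and the total energy a 500-megawatt, 400-second ITER pulse would represent.

```python
# Back-of-envelope numbers based on the figures quoted in this article.
jet_energy_megajoules = 59    # JET record, February 2022
jet_pulse_seconds = 5
iter_power_megawatts = 500    # ITER design goal
iter_pulse_seconds = 400

jet_average_power_mw = jet_energy_megajoules / jet_pulse_seconds
iter_pulse_energy_gj = iter_power_megawatts * iter_pulse_seconds / 1000  # MJ -> GJ

print(f"JET average fusion power: {jet_average_power_mw:.1f} MW over {jet_pulse_seconds} s")
print(f"ITER target pulse energy: {iter_pulse_energy_gj:.0f} GJ over {iter_pulse_seconds} s")
```

The roughly 12 megawatts JET averaged is still far from ITER's 500-megawatt goal, which is why the 2025 experiments matter so much.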

overall takeaways

Recent news suggests that fusion has the potential to become a mainstream source of renewable energy, and the JET experiment gives reason for optimism that ITER will be a success. However, ITER's experiments only begin in 2025, and the results may differ from what is predicted. For now, we will still have to rely on current methods of renewable energy production.

– Jimmy Huang

Will Artificial Intelligence Save Humanity Or End It?

Artificial Intelligence, or AI for short, is the broad idea that machines can make decisions and perform tasks without being explicitly programmed to. AI uses machine learning, deep learning, and other techniques to learn how to solve new problems, much like a human child does. In some ways, AI is very similar to the neurons in the human brain, because it relies heavily on neural networks: large, interconnected algorithms that process information by responding to different inputs. Since AI is a very new topic in our era, there are many different perspectives on its impact on humanity. Some consider AI a potential threat that will one day dominate the Earth, while others believe that AI is the savior that will solve most of humanity's problems.

A drawing of a small neural network. A neural network starts with one or more inputs and processes that information until it produces an output, or response. Image taken from a public domain: https://www.shutterstock.com/image-vector/neural-network-graphic-scheme-artificial-intelligence-1583864422
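For readers curious about what "processing inputs into an output" looks like in practice, here is a minimal toy neural network in plain Python/NumPy. The weights are made up purely for illustration; real networks learn their weights from data rather than having them hard-coded.

```python
# A toy feedforward neural network: 2 inputs -> 3 hidden neurons -> 1 output.
# Weights are made up for illustration; real networks learn them from data.
import numpy as np

def sigmoid(x):
    """Squash any number into the range (0, 1)."""
    return 1 / (1 + np.exp(-x))

# Hand-picked weights and biases (normally learned during training).
w_hidden = np.array([[0.5, -1.0, 0.8],
                     [1.2,  0.3, -0.5]])   # 2 inputs x 3 hidden neurons
b_hidden = np.array([0.1, -0.2, 0.05])
w_output = np.array([0.7, -0.4, 1.1])      # 3 hidden neurons -> 1 output
b_output = 0.2

def predict(inputs):
    """Run the inputs forward through the network to get one output."""
    hidden = sigmoid(inputs @ w_hidden + b_hidden)
    return sigmoid(hidden @ w_output + b_output)

# Example: two input values in (say, two sensor readings), one score out.
print(predict(np.array([0.9, 0.1])))   # a number between 0 and 1
```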

How dangerous is AI?

Many people who dislike AI claim that AI’s knowledge is superior and beyond the human level. Just like a calculator, AI is very good at taking numbers and multiplying or adding them. However, AI itself isn’t too smart, as it lacks the ability to solve problems larger than its scope of intelligence. When AI reaches a problem beyond its scope, it simply produces error messages and fails. On the other hand, when humans reach a difficult problem, they look for other ways to solve the issue. Unlike humans, AI can’t perform complex divergent thinking, at least not yet!

Over the past few years, many jobs have been replaced by automated machines and artificial intelligence. As a result, there have been many concerns about the threat of AI taking over human occupations. Fortunately, unlike in many science fiction movies, AI cannot take over human jobs and permanently replace us, because humans are inherently needed to create the training data on which AI is built. Additionally, unless the tasks are on an enormous scale, it is much cheaper and more efficient to use humans instead of AI.

In many large companies, jobs are being replaced by AI for the long-term interest of the company. This graph shows the most vulnerable jobs in the US. Photo from CBINSIGHTS: https://www.cbinsights.com/research/jobs-automation-artificial-intelligence-risk/

Is AI perfect?

So by now, you might be thinking that AI is perfect. Not at all! AI is used around the world for many important tasks and decision-making processes, including Amazon's hiring process, the iPhone's Siri and facial recognition, and Tesla's self-driving cars. With the advancement of technology, many companies are trying to create their own AI systems at a fast pace, which leads to many bugs slipping through without being fixed. In a worst-case scenario, an AI programmer's bias could subconsciously affect the algorithms that make up the AI. These algorithmic bugs and biases can introduce dangerous racial and gender bias into important decisions such as hiring, loan applications, and criminal justice, leading to discrimination against certain groups in society. Such problems can only be solved by humans looking over and supervising the AI.

In Conclusion

Overall, AI is a huge development and it has helped humanity in many ways, but it is also important to acknowledge that AI will never replace humans and our decision-making. At the same time, the idea that AI will one day end our world persists in the public eye, largely due to a lack of understanding of how artificial intelligence and its algorithms are created and used. Therefore, our goal should be to resolve the issues in artificial intelligence and work towards perfecting its algorithms, so that we can build a better future for all of us!