Below is a list of abstracts by third-year students in the Integrated Science Program:


Harrison Martin – An Investigation of Possible Scaling Relationships Between Dune- and Bar-Scale Features in a Modern Fluvial Setting

The interpretation of ancient environments through their preserved rock records can be aided by the study of modern environments as analogues. One common problem in the interpretation of fluvial paleoenvironments is the estimation of river scales and properties from stratigraphic features. While it is known that scaling relationships exist between unit bars and rivers, the full dimensions of bars are not generally preserved in the rock record. Dunes, however, can sometimes have their dimensions preserved in plan view. For this reason, a scaling relationship between either the wavelength or sinuosity of duneforms in plan view and the unit bars upon which they form would allow the approximation of ancient unit-bar scales, and thus river properties, from preserved dunes. To this end, high-resolution satellite remote sensing data will be used to view modern river systems with exposed dunes and bars. Measurements will be taken and analysed using statistical techniques to investigate the possibility of a reliable scaling relationship between any of various dune or bar measurements. If successful, this research could aid hydrocarbon resource exploration. Most of the world’s oil and gas reserves are located in sedimentary rocks, and many of those (including most of the Alberta oilsands) in the preserved products of ancient fluvial-deltaic systems. The discovery of new scaling relationships between dunes and bars could help in developing new interpretation methods for these paleoenvironments for both academic and industry purposes.
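The statistical step described above can be sketched in code. The following Python snippet fits a power-law scaling relationship between dune wavelength and unit-bar length by least squares in log-log space; the measurements are entirely hypothetical, since the abstract reports no data:

```python
import numpy as np

# Hypothetical measurements (metres): dune wavelength vs. host unit-bar length.
dune_wavelength = np.array([2.1, 3.5, 5.0, 8.2, 12.0, 18.5])
bar_length = np.array([40.0, 75.0, 110.0, 190.0, 300.0, 480.0])

# Fit a power law  L_bar = a * lambda_dune**b  by linear least squares
# on the log-transformed data.
b, log_a = np.polyfit(np.log(dune_wavelength), np.log(bar_length), 1)
a = np.exp(log_a)

# Predict an unseen bar length from a preserved dune wavelength of 6 m.
predicted_bar = a * 6.0 ** b
```

If such a fit held for real data, a preserved dune wavelength alone would yield an estimate of the (unpreserved) unit-bar scale, which is the relationship the project sets out to test.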

Christina Spinelli – Examining the influence of movement on neural entrainment to meter in young infants

Listening to music is a multi-sensory experience in which not only our auditory system but also our motor system plays a major role. While our auditory system processes the pitch of the notes, our motor system entrains to the underlying rhythmic beat, encouraging us to move with it in synchrony. Furthermore, the metrical structure of a song influences how we move to the beat: larger movements are often made on the first beat of a musical bar, every second beat for duple meter (a march) and every third for triple meter (a waltz). While what we hear influences how we move, the reverse is also true. Behavioural research has shown that bouncing an infant on every second or third beat biases their interpretation of an ambiguous rhythmic stimulus (a pattern that can be interpreted as either duple or triple meter). Similar effects of movement on metrical perception can be measured behaviourally in adults. The effect can also be measured at the neural level by analyzing the steady-state evoked potentials in an adult’s electroencephalogram (EEG) in response to rhythms. The objective of the current project is to take advantage of this new approach to determine how the infant brain encodes meter. Seven-month-old infants will be bounced on either every second or every third beat as they listen to an ambiguous rhythmic pattern. Then, EEG will be recorded while the infants sit quietly on their mother’s lap and listen to the ambiguous rhythm, repeated for 18 minutes. We hypothesize that the way an infant is bounced will influence how the ambiguous pattern is interpreted, and that these different metrical interpretations will be reflected in corresponding changes in the neural entrainment to the rhythm as captured with EEG.
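The steady-state readout described above amounts to inspecting the EEG amplitude spectrum at beat-related frequencies. A minimal Python sketch (the function name, sampling rate, and frequencies are illustrative, not taken from the study):

```python
import numpy as np

def meter_peak_amplitudes(eeg, fs, beat_hz):
    """Amplitude spectrum of an EEG trace, read out at beat-related frequencies.

    In frequency-tagging analyses, steady-state responses appear as spectral
    peaks at the beat rate and at its duple (1/2) and triple (1/3)
    subdivisions; comparing those peaks indexes the metrical interpretation.
    """
    spectrum = np.abs(np.fft.rfft(eeg)) / len(eeg)
    freqs = np.fft.rfftfreq(len(eeg), 1.0 / fs)

    def amp_at(f):
        # Amplitude at the frequency bin closest to f.
        return spectrum[np.argmin(np.abs(freqs - f))]

    return {"beat": amp_at(beat_hz),
            "duple": amp_at(beat_hz / 2),
            "triple": amp_at(beat_hz / 3)}
```

A relatively larger peak at the duple (or triple) subdivision would be taken as evidence that the infant's brain encoded the ambiguous rhythm in that meter.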

Douglas Chan

No abstract provided.

Trystan Nault – GPCR Interacting Proteins (GIPs) and Their Roles in Synaptic Plasticity

G protein-coupled receptors (GPCRs) are generally activated by a ligand binding to a binding pocket. Following this initiation at the extracellular motif(s), conformational changes occur that allow signalling to intracellular G proteins and the activation of downstream pathways, leading to any number of changes within a cell or tissue. GPCR-interacting proteins (GIPs) have functional domains that allow these downstream pathways to proceed following the initial activation and conformational change of the GPCR. GIPs also traffic GPCRs to subcellular destinations such as the cell membrane, where the receptors can function as metabotropic receptors for neurotransmitters. GIPs play a role in synaptic plasticity in that they can be signalled to potentiate or depress a synapse by adding or removing GPCRs from a post-synaptic membrane. On a larger scale, GPCRs can mediate cell signalling by acting as scaffolds for the recruitment of GIPs, which modulate GPCR function and signal transduction. GIPs also regulate the specificity of GPCR binding pockets, receptor endocytosis, expression in the post-synaptic membrane, and receptor recycling.

A literature review of historical and current research on the specific roles of GIPs in classical long-term potentiation (LTP) mechanisms at CA3–CA1 hippocampal synapses was conducted. In addition, the roles that different GIPs play in the more general function of the neuron are discussed to add breadth to the topic. Potential future research questions to further the current understanding of classical LTP are identified, along with possible methods through which they could be answered. Hypotheses and possible outcomes are overviewed, together with the implications of each scenario for present knowledge of the subject and for possible applications in drug discovery and health care.

Josanne White – The Effect of Language and Structure on Mathematical Word Problem Solving

In the 1980s and 90s, several researchers examined the role of wording in elementary school children’s ability to solve single-step addition and subtraction problems. It was found that certain ways of structuring a word problem cause children to form a specific mental representation of the problem. Some of these mental representations are more useful than others for selecting the correct problem-solving strategy, so problems that elicit the best mental representation are much easier to interpret and solve. Research has since tried to determine the optimal structure or wording for this type of problem. More recently, Vicente et al. found that conceptual rewording, the addition of statements that clarify mathematical relationships, was the most effective approach.

Caroline van Every – The impacts of Bythotrephes longimanus on the food web structure of Canadian inland lakes

Bythotrephes longimanus, commonly known as the spiny waterflea, is a predatory zooplankton species native to Northern Europe and Asia. It was first introduced to the Great Lakes in the 1980s and has since spread to many inland lakes in the surrounding region. While the impacts of Bythotrephes on zooplankton abundance and community structure have been investigated extensively, less is known about its effects on species occupying higher trophic levels. Additionally, research has shown that Bythotrephes is a potential competitor with small and juvenile fish for zooplankton prey. This study investigated the impacts of Bythotrephes on the food web structure of Canadian inland lakes. Carbon and nitrogen stable isotope ratios were analyzed to compare the trophic positions of zooplankton and fish populations in lakes that either have or have not been invaded. Past research has indicated that lakes containing Bythotrephes exhibit reduced herbivorous zooplankton biomass, as well as increased proportions of omnivorous and predatory zooplankton. Therefore, it is predicted that invaded lakes will exhibit zooplankton and fish communities with elevated trophic positions. Investigating the effects of Bythotrephes on food web structure will allow for a better understanding of its impacts as an invader, which is especially important if it is significantly modifying food webs or outcompeting small native fish species.
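For context on the analysis, trophic position is commonly estimated from nitrogen stable-isotope ratios under the convention of roughly 3.4‰ enrichment in δ¹⁵N per trophic level above a primary-consumer baseline. A minimal Python sketch (the function name and example values are illustrative, not from the study):

```python
def trophic_position(d15n_consumer, d15n_baseline,
                     enrichment=3.4, baseline_tp=2.0):
    """Estimate trophic position from delta-15N values (per mil).

    Assumes ~3.4 per-mil enrichment in delta-15N per trophic level and a
    primary-consumer baseline at trophic position 2 (a common convention).
    """
    return baseline_tp + (d15n_consumer - d15n_baseline) / enrichment

# Illustrative values: a fish with delta-15N of 11.2 over a baseline of 4.4
# sits roughly two trophic levels above the primary consumers.
fish_tp = trophic_position(11.2, 4.4)
```

Comparing such estimates between invaded and uninvaded lakes is what would reveal the predicted elevation of trophic positions.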

Daniella Pryke – Comparing the efficacy of SSRIs and cognitive behavioural therapy, alone and in combination, on anxiety disorders

Approximately one in six people living in North America will experience an anxiety disorder during their lifetime. In these individuals, anxiety is persistent and severe, and causes distress in their lives. Two of the most common methods of treating anxiety disorders are selective serotonin reuptake inhibitors (SSRIs) and cognitive behavioural therapy (CBT). It is therefore important to assess how effective these treatments are, and whether they are more effective independently or in conjunction. To answer this question, I am conducting a systematic literature review on PubMed using the following keywords: “cognitive therapy” AND “anxiety disorders/therapy” AND “serotonin uptake inhibitors/therapeutic use”. Of the papers this search generated, only those in English that directly compared SSRIs and CBT were retained, creating a set of 22 papers. These papers were then grouped by anxiety disorder, age, and specific SSRI to determine the best area of focus. I chose to look at nine papers comparing the efficacy of CBT with and without SSRIs in children and youth (ages 7-17). The first of these papers was published in 1997 and the most recent in 2013. I expect that CBT and SSRIs will be more effective when used in conjunction, and that when the comparison is between CBT alone and SSRIs alone, SSRIs will be more effective. This research will provide insight into whether SSRIs or CBT is more effective, and whether they work better in conjunction; I also hope to clarify whether one treatment option is better than the other when treating children with anxiety disorders.

Melissa Ling – Transcriptional and Metabolic Changes in the Inflammatory Response

Medical advances in the past century have given first-world populations greater access to health care and medicine, leading to an ever-increasing elderly population. The elderly are more vulnerable to infectious diseases, however, and this barrier must be tackled if the boundaries of age are truly to be pushed back. Unlike a young person’s, an aged individual’s immune system is less successful at mounting an effective immune response following pathogen infection. Since macrophages are important first-line defenders of the innate immune system against pathogens, their performance in response to different stimuli indicates the immune response being generated. Macrophages are polarized into M1 (“pro-inflammatory”) or M2 (“repair”) phenotypes. M1-activated macrophages mount a microbicidal defense and benefit from glycolytic pathways, which provide the building blocks necessary for their mechanisms of pathogen destruction. Previous studies have demonstrated that metabolic changes that occur with age result in an inability to switch to the glycolytic pathway and are subsequently responsible for impaired macrophage function in older individuals. As such, altering metabolic cycles to restore glycolytic activity is an attractive target for slowing the effects of aging. To know which part of these cycles to target, however, it is first imperative to understand how and when young murine macrophages react upon activation by LPS stimulation. A targeted metabolic analysis using gas chromatography–mass spectrometry (GC-MS) was completed, and RNA sequencing results were analyzed to investigate these metabolic changes. Our findings indicate that the greatest metabolic changes occur 16 hours after LPS stimulation, whereupon young macrophages shift their metabolism to aerobic glycolysis. At that time, genes involved in glucose metabolism were upregulated.
Given the importance of aerobic glycolysis for activated macrophages, future research is needed to investigate whether altering metabolite concentrations in the macrophage growth medium restores the ability of aged macrophages to switch to aerobic glycolysis upon infection.

Phil Lauman – Elucidating the site of TRAF-6 sequestration on SR-AI in Macrophages

On macrophages and several other types of immune and immunity-related cells, Toll-like receptors (TLRs) are responsible for pathogen recognition and the subsequent activation of cellular pathways that mediate various aspects of the immune response. Previous studies have shown that cell-surface activation of TLR-2 in macrophages triggers the MyD88 signalling pathway, while endosomal activation of TLR-2 in the same cells triggers the TRIF-TRAM pathway. Since the activation of these pathways is inappropriate in the presence of certain stimuli, cellular mechanisms must exist to downregulate the signalling pathways under certain conditions. Indeed, it has recently been determined that SR-AI, a scavenger receptor involved in macrophage-driven phagocytosis, may contain a motif that binds and sequesters TRAF-6, a downstream adapter of the MyD88 pathway, thus preventing a response. SR-AI normally contains two putative binding motifs, known as the TRAF-2 and TRAF-6 binding motifs (T2BM and T6BM), located on the cytoplasmic and extracellular domains, respectively. Although one might naturally expect the latter motif to be involved in TRAF-6 sequestration, recent studies indicate that the TRAF-6–SR-AI interaction occurs in the cytoplasm, and the T2BM may therefore be a candidate.

In this study, we attempt to identify the SR-AI motif involved in the sequestration of TRAF-6 and thus the downregulation of the MyD88 signalling cascade. To achieve this, we use a SEAP assay to measure the level of TRIF-TRAM activation in HEK-293T cells transfected with either wild type (WT) SR-AI or SR-AI mutants with T2BM, T6BM, or T2/6BM deletions. Since TRIF-TRAM activation and concentration of free TRAF-6 are positively correlated, analysis of the SEAP assay allows us to determine which mutants are associated with lower levels of TRIF-TRAM activation, and thus identify the binding motif(s) involved in the sequestration of TRAF-6. Negative controls are used to determine baseline levels of TRIF-TRAM activation in the absence of endosomal internalization by SR-AI. Pam3Csk4, a synthetic analogue of bacterial lipopeptides, is used as the primary ligand to activate TLR-2.

We expect to find significantly higher levels of TRIF-TRAM activation in the T6BM and T2/6BM transfectants when challenged with both Pam3Csk4 and SP P1121, demonstrating that the T6BM is involved in TRAF-6 sequestration. These results will clarify the localization of TRAF-6 – SR-AI interaction, and may eventually be used to produce drugs which promote activation of the MyD88 and TRIF-TRAM pathways by blocking TRAF-6 sequestration.

Jared Valdron – Home Sweet (Materialistic) Home: The Contextual Malleability of the Implicit Association between Wealth and Happiness

Promotions and overtime are a part of professional life and often present an important trade-off: more work in exchange for more pay. Classic economic theory assumes that people make these kinds of decisions completely rationally and independently of irrelevant factors, but a growing body of research suggests that this is not the case. The present study examined whether a person’s likelihood of taking more work for more pay changes with the context in which the decision is made, through malleability in their implicit association between wealth and happiness. Specifically, it was investigated whether “Work”, “Lab”, and “Home” settings differentially affect decisions to take more work for more pay. In an experiment administered online, participants were first primed with either “Work”, “Lab” or “Home” settings through a writing task. Second, participants took an Implicit Association Test (IAT) assessing their association between “Wealth” and “Happiness”. Third, participants denoted their willingness to take an increase in work for an increase in pay. Finally, participants were asked how much they associated wealth and happiness on an explicit level, and subsequently completed the short Money Ethic Scale (MES). Contrary to initial predictions, participants implicitly associated “Wealth” and “Happiness” more strongly when primed with “Home” than with “Work” or “Lab”. Consistent with initial predictions, participants’ IAT scores (but not explicit attitudes or MES scores) were positively correlated with their willingness to take more work for more pay. These findings can be applied in industry, where employers could encourage employees to make the final decision about taking a promotion or doing overtime while at home. These results also have theoretical implications for the nature of implicit attitudes, lending support to the views that implicit attitudes are malleable and constructed on the spot.


Jesse Bettencourt – The Arduino Platform and Science Education

Arduino is an open-source electronics prototyping platform that uses well-documented hardware and software to provide a rich, open, and accessible interface for user-created electronics. The platform is employed across many areas of interest, ranging from creative installations by artists to mechatronic projects by engineers, and has great potential as a learning tool in the sciences. This presentation will introduce the platform, highlight examples of introductory Arduino projects, and discuss the relevance of Arduino to science curricula. It will overview the hardware, software, and resources available to students interested in pursuing an Arduino project, and will showcase an example of open hardware in undergraduate lab design.

Aaron Goldberg – Numerical Approximations of Partial Differential Equations using Finite-Difference Methods

Disciplines such as physics, chemistry, and economics are governed by descriptions of how certain properties change relative to others. These descriptions are often codified mathematically as partial differential equations (PDEs), which relate multivariable functions to one or more of their partial derivatives. The solution of a PDE quantifies how a system will behave in time, space, and/or a mixture of other variables; however, most PDEs are not solvable analytically. By approximating solutions of PDEs, scientific computing can be harnessed to model the evolution of otherwise unsolvable complex systems. These approximations have inherent imprecisions, and special care must be taken to ensure an appropriate approximation is used.

This project characterizes the use of the forward-time central-space finite-difference (FTCS) method to approximate solutions of two common PDEs: the heat equation and the wave equation. MATLAB codes were written to model the evolution of these two PDEs. Partial derivatives were approximated by finite difference equations, yielding equations for the systems’ states one time step in the future of the current states. Matrix manipulation was used to iteratively evaluate these states for arbitrary lengths of time, to analyze the behaviour of various initial conditions.
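The abstract's MATLAB code is not reproduced here, but the FTCS update it describes can be sketched in a few lines of Python (the function name, grid, and boundary choice are illustrative):

```python
import numpy as np

def ftcs_heat(u0, nu, dx, dt, steps):
    """Advance the 1-D heat equation u_t = nu * u_xx with the FTCS scheme.

    Boundary values are held fixed (Dirichlet); the scheme is stable only
    when r = nu * dt / dx**2 <= 0.5.
    """
    r = nu * dt / dx ** 2
    u = np.asarray(u0, dtype=float).copy()
    for _ in range(steps):
        # Explicit update: u_i(t+dt) = u_i + r * (u_{i+1} - 2 u_i + u_{i-1})
        u[1:-1] += r * (u[2:] - 2.0 * u[1:-1] + u[:-2])
    return u
```

Each call to the update advances the state one Δt; iterating it for an arbitrary number of steps is the matrix-free equivalent of the iterative evaluation described above, and choosing Δt so that r exceeds 0.5 makes the solution oscillate and blow up, illustrating the stability bound reported below.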

The known, exact solutions of the heat and wave equations were used to characterize the error, stability, and convergence of the approximations. For the heat equation, the FTCS method was found to be stable when ν·Δt/Δx² ≤ 0.5, where ν is the diffusion coefficient, Δt is the size of each temporal division, and Δx is the size of each spatial division; the truncation error was of order Δt + Δx². For the wave equation, the method was found to be stable when c²·Δt²/Δx² ≤ 1, where c is the wave speed; the truncation error was of order Δt² + Δx². These results were used to verify the calculated theoretical values for convergence and error.

The wave equation is further hypothesized to be subject to diffusive errors, whereby waves with sharp corners become rounded under this iterative time scheme, and to dispersive errors, whereby well-defined waves spread out over time. Both PDEs will also be tested for well-posedness, i.e. whether their evolution can be retraced backwards in time. These results are important for understanding the extent to which finite-difference approximations can be used to model everyday phenomena.

Rebecca Dipucchio – Transforming the Development of Inquiry Skills: To What Extent Does Participation in an Inquiry Course Enhance the Development of Inquiry Skills as Compared to Other Inquiry-Based Opportunities?

Inquiry-based learning (IBL) within chemistry has been well characterized in previous literature, but that literature focuses entirely on inquiry-based labs or on inquiry in a general first-year course. Moreover, there is no documented IBL material specific to chemical biology courses, and no literature documenting student perceptions of any upper-year inquiry-based course. McMaster continues its history of innovation in IBL with ChemBio 2Q03, Inquiry for Chemical Biology. This second-year course is required for all Honours Chemical Biology students and exposes them to a set of IBL skills as defined in the literature. A key goal of ChemBio 2Q03 is for students to develop these skills in preparation for future inquiry-based experiences. One such experience is the inquiry project in Chem 3AA3, Instrumental Analysis, a course completed by Chemistry and Chemical Biology students together.

This study investigated the perceptions of level three, four, and five Honours Chemistry and Honours Chemical Biology students regarding the development of their IBL skills for application in the Chem 3AA3 inquiry project. Within this, the role of ChemBio 2Q03 in IBL skill development, as compared to other IBL skill development opportunities, was determined. All study data were obtained through a combination of an online survey filled out by level three, four, and five Chemistry and Chemical Biology students and in-person interviews conducted with instructors and teaching assistants from ChemBio 2Q03 and Chem 3AA3. Information was analysed both qualitatively and quantitatively, by looking for quantitative survey trends and by combining survey data with interviews to identify themes. The survey results will indicate whether students perceive that ChemBio 2Q03 improves their IBL skills, and whether the course stands out among other possible IBL experiences for Honours Chemistry and Honours Chemical Biology students. In addition to discussing any qualitative or quantitative trends, possible contrasts between responses from students in the online survey and from instructors in the in-person interviews will be explored. These results will be given context in the broader body of literature surrounding inquiry in chemistry and will be compared to any existing published information on student perceptions of IBL.


Kerri Kosziwka – Case Studies as a Pedagogical Tool

Students in the Integrated Science (iSci) Program at McMaster University in Hamilton, Ontario, Canada learn concepts through methods that are not common in Canadian universities. With a limited enrolment of 60 students per year, a variety of techniques involving problem-based learning are used effectively. This project explores the benefits of one learning technique commonly used in the program: case studies. Case studies are often used to promote critical thinking skills. In university, these skills are often not developed until later in a student’s education, as success in large classes is usually determined by the ability to memorize facts. Case studies, however, are an effective teaching method that promotes active learning, supports problem solving, and encourages the development of critical reasoning and analysis skills.

Two case studies teaching concepts from the Life Sciences component of iSci 1A24 are being made in differing formats: a handout with questions and a PowerPoint presentation. Each case study revolves around the same model organism, the crown-of-thorns starfish (COTS); using the same organism in both gives students familiarity and consistency. The first case study aims to teach evolutionary concepts consistent with those learned in iSci 1A24, looking into paleobiology and the fossil record’s connection to evolution, as well as problems that can arise with fossilization in marine systems. Next, ecology concepts will be taught through a PowerPoint presentation and iClicker questions. As ecology is the science of interactions, students will explore how COTS interact with their environment through a presentation outlining their predators, prey, feeding habits, abundance, and conservation efforts.

The use of these case studies works on two levels: the implications for the students and for the broader use of case studies. In iSci specifically, students become very comfortable with group settings through class discussions and group projects. Adding to the amount of collaborative learning should enhance student satisfaction with the entire learning process. In addition, by discussing the topics with a group or the whole class, students are exposed to a variety of ideas, which will extend their engagement further.

Nicholas Goncharenko – Using the Arduino Platform to Teach Neuroscience Students Information Processing and Scientific Models in the Context of the Visual System.

Many undergraduate students receive limited exposure to the analysis of scientific models and complex systems; an estimated 7% of students in North America have taken a course in computer science before attending university. This is problematic, as modern science often requires translating scientific models and complex systems into forms that can be simulated by a computer. In an effort to make computer science more accessible and hands-on for these students, a proposal has been developed for an undergraduate lab that could be carried out at McMaster University, aimed at teaching students information processing and scientific models in the context of the visual system. The lab will give students the opportunity to model visual systems using Arduino microcontrollers: small, open-source computers designed to do one task at a time. Arduino microcontrollers were chosen because they can be used to model neurons as computers. Students will be shown a model of the visual system that uses Arduino microcontrollers to sense light and correctly identify colours, and will then be challenged to construct their own alternate model of the visual system, with modified Arduino microcontrollers and software programs at their disposal to test it. At the end of the lab, students should understand the process behind building a scientific model and gain an understanding of important concepts such as emergent properties. A significant part of the presentation will focus on the benefits of using Arduino to teach scientific concepts, mainly its cost-effectiveness, adaptability, and multidisciplinary use for teaching many important concepts in science beyond neuroscience.

Mackenzie Richardson – Improving the McMaster Outdoor Orientation Student Experience (MOOSE)

The McMaster Outdoor Orientation Student Experience (MOOSE) program is a first-year experience (FYE) that aims to help incoming students transition into life at McMaster University. The program is currently preparing for its third session; to date, a total of 120 students from Arts and Science 1, Integrated Sciences 1, Kinesiology 1, and Social Sciences 1 have attended MOOSE. The program uses camping and canoe tripping to create an outdoor education setting in which students form relationships with peers and faculty, learn about McMaster and its programs, and come to better understand other aspects of university life. MOOSE is looking to evolve so it can provide the best possible experience for the greatest number of students. With this in mind, feedback from past participants and examination of outdoor FYE programs at other universities are necessary.

This research project aimed to accomplish both these tasks. Past MOOSE participants were invited to participate in an online survey, which focused on how effective and important they perceived MOOSE to be for their transitions into university life. A literature review of available research on other outdoor FYE programs was conducted, drawing inspiration for how the program can improve. The results of the literature review and the online survey were analyzed and compiled into a manuscript and a set of training documents for future MOOSE student leaders. The manuscript summarizes the literature review, and makes suggestions for changes to the MOOSE program.

The research performed is very important for the future of the MOOSE program. Research has shown that effective FYE and transitional programs can dramatically increase how welcome students feel at a university, decrease their perceived levels of stress, and improve their abilities to form relationships. Overall, this can lead to higher student retention at an institution and improve overall perceived satisfaction with the university experience.


Matt Galli & Mary Kate MacDonald – Comparing Student Stress Levels in Interdisciplinary Programs at McMaster University

Undergraduate students experience high degrees of stress due to the transition to a more independent life and the high volume and constancy of academic demands and evaluations associated with university. This stress is frequently correlated with illnesses, both physical and psychological in nature, and high levels of stress often result in decreased academic success. To address the issue of student stress, it is paramount to identify student populations that experience heightened stress levels, as well as the potential causes, or stressors. The evolution of novel and unconventional undergraduate science pedagogies and teaching environments, exemplified by the Integrated Science program at McMaster University with its problem-based, small-group learning style, contrasts with the more traditional large-scale, lecture-based teaching style of the Life Sciences program. In light of these novel approaches to learning, there is a need to understand how the new techniques affect the perceived stress of science undergraduates. This project uses an online survey sent to students in the first and third years of both the Life Sciences and Integrated Science programs to quantify potential differences in their perceived stress, and the potential causes and coping mechanisms unique to their programs. The survey contains 19 questions, takes approximately 10 minutes to complete, and is divided into three main sections. The first part categorizes the student by year, program, and gender. The second employs the perceived stress scale to obtain a quantitative measurement of perceived stress. The third section aims to elucidate program-specific reasons for any trends that appear, with questions investigating available coping mechanisms, learning strategies, and teaching strategies.
Ultimately, this research is intended to identify the sources of stress in undergraduate science programs that derive from the academic environment. A better understanding of environmental stressors related to various programs and pedagogies will provide motivation and rationale for reducing student stress, and input for improving existing pedagogies to minimize stress while maximizing the student learning experience.

Jonathan Park – From ancient calendars to the calendar today

The world moves in a continuous cycle of time. From the moment of birth, most people treat the calendar as if it had always existed. People now make plans in units of hours, days, weeks, months, and even years, and it is almost impossible to imagine the world without a calendar system. Yet the calendar is a very complex system that was not created instantly. Many different calendars emerged in the past that either disappeared in the course of history or influenced the calendar that is used today.

The purpose of this project is to investigate different types of calendars that existed in the past, and how they are interconnected with or isolated from each other. The main calendar systems investigated were the Egyptian, the Mayan, the Babylonian, the Julian (Roman), the Hebrew (Jewish), and the current calendar. Each system has its own distinctive features, and the systems also share similar characteristics. In addition, historical processes such as calendar reforms are discussed to demonstrate the transition from ancient to modern calendar systems.

The investigation was conducted by means of a literature review, by examining library catalogues and journal articles. The catalogues were available from the libraries of McMaster University and University of Toronto Mississauga, and journal articles in online format were accessed via library websites.

The results suggest that the current calendar system was influenced mostly by the Babylonian, Julian, and Hebrew calendars, which dominated most regions of Europe and the Middle East. The Mayan calendar, however, was one of the most complicated calendars in ancient history, and unfortunately the self-destruction of Mayan culture due to civil wars did not allow its knowledge to be passed on. The Egyptian calendar, which dominated the North African region, disappeared soon after the expansion of the Roman Empire.

Examining past calendars alongside the current one shows how humanity has continuously struggled to create a better calendar in order to achieve more efficient time-keeping and a more systematized society.


Alexandra Kasper – A NetLogo Model for Fractioned Radiation Treatment

Radiation therapy is one of the most common methods for treating cancer.  When mammalian cells are irradiated, their chance of survival depends on many factors including dose and type of cell. The effect of radiation on cell viability can be expressed through cell survival curves. The relationship between dosage and cell viability can be determined experimentally by measuring the surviving fraction of cells after exposure to varying amounts of radiation. Recognizing the differences between the response of tumour cells and normal tissue to irradiation is crucial to treatment planning of radiation therapy for cancer patients. Additionally, understanding cell survival curves can help to explain the actual mechanism of radiation damage on the cellular level: what is happening within the cell that causes radiation to kill some of the population?

Presently, there are many mathematical models which agree with cell survival data to varying degrees of accuracy. The linear quadratic (LQ) model is one of the most widely used models in the teaching of cell survival curves. The LQ model utilizes α/β values, which are a measure of the sensitivity of the tissue to radiation, and also determine the curvature of the cell survival curve. Rather than using a single high radiation dose, following a fractionated dose schedule can amplify the response differences between the tumour and normal cells and ideally maximize damage to tumour cells, while minimizing damage to surrounding healthy tissue. Developing a fractionated radiation treatment schedule requires consideration of the α/β value, applied dose, and treatment frequency.
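As an illustration of the fractionation argument above, the LQ surviving fraction S(D) = exp(−(αD + βD²)) can be compared for a single large dose versus the same total dose split into n fractions, assuming complete repair of sublethal damage between fractions. The Python sketch below (the project itself uses NetLogo) relies on hypothetical α and β values, not clinical parameters:

```python
import math

def surviving_fraction(dose, alpha, beta):
    """LQ model: fraction of cells surviving a single dose (Gy)."""
    return math.exp(-(alpha * dose + beta * dose ** 2))

def fractionated_survival(total_dose, n_fractions, alpha, beta):
    """Survival after splitting the total dose into n equal fractions,
    assuming full sublethal-damage repair between fractions."""
    d = total_dose / n_fractions
    return surviving_fraction(d, alpha, beta) ** n_fractions

# Illustrative (hypothetical) parameters: alpha = 0.3 /Gy, beta = 0.03 /Gy^2,
# i.e. an alpha/beta ratio of 10 Gy.
single = fractionated_survival(60, 1, alpha=0.3, beta=0.03)
fractionated = fractionated_survival(60, 30, alpha=0.3, beta=0.03)
```

Because the quadratic β term acts on the per-fraction dose, splitting the 60 Gy into 30 fractions raises the surviving fraction dramatically (from exp(−126) to exp(−21.6) here); tissues with low α/β values are spared the most, which is the rationale for fractionated schedules.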

In this project I developed a NetLogo model which allows the user to adjust the α/β value, applied dose, and frequency of treatment to create an ideal fractionated radiation treatment schedule. NetLogo is an agent-based programming environment designed to be an educational tool. This model uses the LQ model and other concepts from iSci’s first-year cancer research project and is intended for possible future use within that project. The model will allow students to compare different clinical treatment schedules, such as hyperfractionation and conventional strategies, as well as observe the consequences of stopping treatment before completion.

Rebekah Ingram – Potential Contamination of a Private Drinking Water Well

During the Pleistocene Epoch, the Laurentide Ice Sheet slowly advanced to cover a large portion of North America, with its maximum size and thickness occurring approximately 20 000 years before present. This glaciation deposited a thick layer of sediment across the area known today as Southern Ontario. In many areas across Southern Ontario, glacial deposits serve as the primary aquifers for municipal and private drinking water needs.

This study investigates the cause of a water quality issue encountered at a private drinking water well in rural Ontario. Water from this well is characterized by a foul odour comparable to rotten eggs, foaming or fizzing at the tap, and brownish or blackish deposits. This water does not exceed any Ontario Drinking Water Standards; however, it has been shown to have levels of iron and manganese far higher than their aesthetic objectives. The area in question is underlain by glaciolacustrine deposits of surficial sand, gravel, and silt. The well draws from an unconfined aquifer, which makes it susceptible to contamination. It was previously determined that the high iron and manganese concentrations, coupled with a lack of nitrate and sulphate in the well water, indicated a reducing environment in which sulphate reducers could be biofouling the groundwater. It has been theorized that these reducing conditions in the complainant’s well could be due to the past discharge of wastewater into the aquifer by a food-grade trucking company located directly across the street from the complainant’s property. In 2013, three monitoring wells were installed on the trucking company’s property to determine the direction of groundwater flow and the water quality. This project will involve additional analysis of the hydrochemical, hydrogeologic, and sedimentologic data collected during investigation of the complainant’s water issues to determine whether the trucking company is responsible for these water quality issues. Water quality data will be compared to Ontario Drinking Water Quality Standards, to the water quality of nearby wells, and to typical geochemical parameter values in glaciofluvial sediments. A subsurface geology map of the site will also be made using well record data and a logged sediment core.
If the source of the water quality issue is determined to be natural rather than anthropogenic, the results of this study may be applicable to the quality of water drawn from glacial aquifers across Southern Ontario.

Nathaniel Smith – Nuclear War and Nuclear Peace: A Holistic Approach to the Manhattan Project

In 1939, Albert Einstein and Leo Szilárd wrote a letter to Roosevelt addressing the concern that Germany was developing weapons of mass destruction. Based on the discovery of nuclear fission by the scientists Otto Hahn and Lise Meitner, Szilárd proposed that the nucleus of an atom could be unlocked to unleash inconceivable amounts of energy. The fear of this weapon in Nazi hands motivated Roosevelt to take action, and the Manhattan Project was born. The following seven years saw the collaboration of some of the greatest minds in the history of science, from Niels Bohr to Richard Feynman. The Manhattan Project was a large-scale, classified operation, employing 129,000 people, including construction workers, plant operators, and military personnel. The project consumed US$26 billion (2014 dollars) and mined thousands of tons of uranium from Canada and the Belgian Congo. On July 16, 1945, the world’s first nuclear bomb was detonated at the Trinity site in New Mexico, and the Atomic Age had officially begun. This plutonium implosion device had the same design as Fat Man, which devastated Nagasaki on August 9, 1945. The Japanese Instrument of Surrender was signed on September 2, and the Manhattan Project was superseded by the Atomic Energy Act of 1946. The Manhattan Project changed the course of history and is controversial on many levels. The death toll of Hiroshima and Nagasaki was 185,000. American scientists expressed moral conflict and circulated the Franck Report, which attempted to halt the use of the atom bomb. In fact, historical literature suggests that Japan was considering surrender before Nagasaki, and that Truman’s motivation for using the atom bombs was partly to intimidate the Soviet Union. To this day, it is still debated whether the use of nuclear weapons on Japan was a defensive effort or a war crime. The Manhattan Project’s positive influence on science, however, is impossible to debate.
During the Manhattan Project, Glenn Seaborg revolutionized the periodic table by discovering nine new transuranium elements (94 through 102) and distinguishing the actinide series. Nuclear medicine also relies on the hundreds of radioactive isotopes discovered and produced by Seaborg. Lastly, the dependence of modern society on electricity (the United States generated 769.3 billion kWh of nuclear electricity in 2012) owes much to the pioneering of Enrico Fermi’s Chicago Pile-1 reactor. Indeed, the impact of the Manhattan Project is vast, both in devastating human civilization and in supporting it. It is for this reason that young scientists should learn of its impact, in order to pursue ethically sound research and avoid disasters.

Jacqui Rotondi – Phenotypic plasticity in Eutrema salsuginea’s herbivore defence mechanisms

Eutrema salsuginea is a crucifer of the Brassica (mustard) family. The natural accession native to Yukon Territory, Canada, shows phenotypic plasticity through tolerance to many abiotic stress factors, including extreme cold, salt, drought, and nitrogen limitation. In the Yukon, E. salsuginea grows on high-sulfur soil, and crucifer plants frequently use sulfur metabolites called glucosinolates for defence against herbivores. We are testing the hypothesis that E. salsuginea shows plasticity with respect to herbivore defence strategies in high- versus low-sulfur environments. According to the resource allocation hypothesis, some plants can distribute nutrients to serve various functions based on the availability of those nutrients. Therefore, more available sulfur could result in stronger herbivore defence mechanisms.

To explore the possibility of phenotypic plasticity in defence mechanisms, we grew E. salsuginea plants in sulfur-rich and sulfur-poor soil, then inoculated the plants with green peach aphids (Myzus persicae). The total number of aphids per plant was counted daily, and statistical testing was used to compare the number of herbivores on the sulfur-rich plants to that on the sulfur-poor plants. We predicted that improving the capacity of the plants to make glucosinolates by growing them on the sulfur-rich soil would reduce aphid numbers in comparison to the sulfur-poor plants. If the predicted results are supported, this experiment would provide evidence of plasticity in E. salsuginea’s herbivore defence mechanisms.
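The abstract does not name the statistical test to be used; one simple, assumption-light option for comparing the two groups is a permutation test on the difference in mean aphid counts. The Python sketch below uses hypothetical counts, not experimental data:

```python
import random

def perm_test_mean_diff(group_a, group_b, n_perm=10_000, seed=0):
    """Two-sided permutation test for a difference in group means.
    Returns (observed difference, approximate p-value)."""
    rng = random.Random(seed)
    observed = sum(group_a) / len(group_a) - sum(group_b) / len(group_b)
    pooled = list(group_a) + list(group_b)
    n_a = len(group_a)
    extreme = 0
    for _ in range(n_perm):
        rng.shuffle(pooled)  # random relabelling under the null hypothesis
        diff = sum(pooled[:n_a]) / n_a - sum(pooled[n_a:]) / (len(pooled) - n_a)
        if abs(diff) >= abs(observed):
            extreme += 1
    return observed, extreme / n_perm

# Hypothetical final aphid counts per plant (not real data):
sulfur_rich = [12, 9, 15, 10, 8, 11]
sulfur_poor = [22, 18, 25, 20, 24, 19]
diff, p = perm_test_mean_diff(sulfur_rich, sulfur_poor)
```

A negative observed difference with a small p-value would support the prediction that sulfur-rich plants host fewer aphids.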

George Wells – The Life of Thales

This paper is a literature review on the formation of the school of Ionia, focusing on the life of Thales. It is not only a literature review but also a critical assessment of the plausibility of different sources. The school of Ionia was founded by Thales of Miletus (624–546 BC), who is credited with being the first Western philosopher to replace superstitious thinking with rational thought in explaining the world. Many achievements are attributed to Thales: he predicted an eclipse in 585 BC and discovered several geometrical theorems (e.g. a circle is bisected by its diameter, and the angles at the base of an isosceles triangle are equal). However, Thales did not leave any written work, or at least none that survived to the modern day. Thales’s life and teachings are instead recorded by a number of other famous philosophers, such as Aristotle and Plato.

As a result, present literature about Thales is spread out over a vast number of sources that make a wide variety of claims about Thales’s postulations and life. Some sources tend to be more definitive about who Thales was and what he did, while other authors are more skeptical. The aim of this review is to determine different aspects of Thales’s life, his possible travels, and his teachings by evaluating the available sources of such information. The next part of the paper will look at the culture of Miletus and at the Greek religion that allowed the school of Ionia to form. My hypothesis is that authors should be more skeptical about the life of Thales, and that too many teachings have been attributed to him. I hypothesize this because the philosophers who wrote about Thales did so after his death.

This study is relevant because Thales is considered to be the founder of science as we know it, yet he is hardly known to the general public. He had an extremely interesting life if one combines all the attributions to him, but it is important to know which are most plausible. Finally, there does not yet seem to be a complete analysis of the life of Thales, as sources tend to focus on one aspect of his life or teachings. Such an analysis could be another outcome of this paper.

Katie Woodstock & Laura Hogg – Interspecific Competition Between Semi-Feral Horse Herds and Giant Pandas in the Wolong National Nature Reserve: Modelling the Impact of Domestic Livestock on Endangered Species

The resources required to support domestic livestock across the globe are tremendous, resulting in habitat disruption and deforestation. On top of this, recent studies indicate that free-roaming livestock may significantly affect the population sizes of at-risk and endangered species through interspecific competition.

One instance of this is in the Wolong National Nature Reserve, where the giant panda population is impacted by farmers allowing their horses to roam in the surrounding forests. The reserve, created to protect the endangered giant panda population, also houses several native communities that rely on subsistence farming. Increases in the horse trade have led to the release of horses within the reserve’s forests, where they are eventually recaptured and sold as the need arises. This practice gives farmers a reserve of potential income without depleting resources on their farms. A previous field study of the region was limited in its survey regions and sample sizes by logistical constraints; this study extrapolates the data obtained from that study by modelling the point at which livestock begin to negatively affect the giant panda population.

Giant pandas are specialist feeders, with bamboo meeting almost all of their caloric needs. Semi-feral horse herds on the reserve also select bamboo as their food source, and the increase in demand caused by the introduction of horses has resulted in a decline in bamboo availability.

The data from this experiment was used to create a NetLogo program. A map of the reserve was overlaid onto a coordinate system where each patch was ranked according to habitat suitability and food availability. Intrinsic qualities such as the lifespan and reproductive rates of each species and the growth rate of bamboo were embedded in the code for the program, while the initial size of each population was manipulated using sliders. It was found that, due to the higher reproductive rate and longer lifespan of the horses, their growth rate per capita significantly exceeded that of the pandas. Starting with the current population of each species, both populations initially increased. Once the horse population exceeded a threshold amount, the giant panda population decreased to zero while the horse population continued to increase until it reached carrying capacity. These results substantiate the measures being taken within the Wolong National Nature Reserve to decrease the semi-feral horse population. The code used in this program can be used with minor alterations to model interspecific competition concerning endangered species in other regions as well.
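The agent-based NetLogo program itself is not reproduced here, but the threshold behaviour it exhibits (panda decline to zero once horses pass a critical abundance) can be illustrated with a much simpler equation-based analogue. The Python sketch below uses discrete-time Lotka–Volterra competition; all parameter values are hypothetical illustrations, not figures from the study:

```python
def simulate_competition(n1, n2, r1=0.05, r2=0.15, K1=500, K2=800,
                         a12=1.4, a21=0.3, steps=2000):
    """Discrete-time Lotka-Volterra competition.
    Species 1 = pandas (slow growth, strongly affected by horses, a12 > 1);
    species 2 = horses. All parameters are illustrative, not field values."""
    for _ in range(steps):
        n1 += r1 * n1 * (1 - (n1 + a12 * n2) / K1)
        n2 += r2 * n2 * (1 - (n2 + a21 * n1) / K2)
        n1 = max(n1, 0.0)  # populations cannot go negative
        n2 = max(n2, 0.0)
    return n1, n2

pandas_final, horses_final = simulate_competition(n1=100, n2=50)
```

With these parameters the horses approach their carrying capacity while the pandas are competitively excluded; lowering the horses’ carrying capacity below K1/a12 ≈ 357 (e.g. K2 = 300) yields stable coexistence instead, which is the kind of threshold the NetLogo model probes.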

Eric Turner – Vortices in the Diffraction Pattern of a Particle Beam

Diffraction has been a key instrument in understanding the properties of waves, light, and quantum particles ever since Young’s fundamental two-slit experiment. It is one of the most powerful tools for understanding wave-particle duality. A simple result of diffraction experiments with two or more slits is interference based on the superposition of waves. Where the superposition sums to zero, the wave function is equal to zero at that point, and where the wave function is zero its phase is indeterminate. In 1974, Nye and Berry published a paper, “Dislocations in Wave Trains”, on the topology of waves, including vortices and singularities. They refer to a singularity as a dislocation; a vortex is a dislocation morphology known as the pure screw dislocation (Nye and Berry, 1974; Berry, 1981). Vortices arise in many different physical systems and describe various phenomena in optics, acoustics, hydrodynamics, and quantum mechanics.

A paper on vortices in quantum mechanics by O’Dell describes the properties of a beam of atoms diffracted by a standing wave of light. The research works through solving for the evolution of the wave function and its behaviour as the atoms pass through the standing wave and diffract. Describing this behaviour involves solving Schrödinger’s equation, which takes the form of the Raman-Nath equation (Mathieu’s equation).

This paper explores the topology of the wave functions of diffracted atoms, specifically to identify vortices. The atoms will be diffracted by a standing wave of light that imparts a potential on the beam of atoms. The beam of atoms obeys the Schrödinger equation, but after passing through the potential, the beam conforms to a specific form of the Schrödinger equation known as the Raman-Nath (RN) equation. The RN equation yields the behaviour of the amplitudes and phase of the wave function, describing how it evolves in time. By analyzing the topology given by the RN equation, we hope to find vortices.

The aim of this project is to find a method to determine where vortices occur using only mathematical analysis, visual inspection, and computing. The final product will be a numerical method to automate the identification of vortices, as well as images that highlight the topology of these vortices and provide a qualitative analysis.
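One common way to automate vortex identification numerically (an assumption here, not necessarily the method the project will adopt) is to sum the wrapped phase differences around each plaquette of the sampled wave function: a net winding of ±2π flags a vortex of charge ±1. A minimal Python sketch, tested on a synthetic field ψ = x + iy with a known charge +1 vortex at the origin:

```python
import cmath
import math

def vortex_charges(psi):
    """Given a 2D grid of complex amplitudes psi[i][j], return a list of
    (i, j, charge) for plaquettes whose phase winds by a nonzero multiple
    of 2*pi, i.e. candidate vortex locations."""
    def wrap(dtheta):
        # Map a phase difference into the interval [-pi, pi).
        return (dtheta + math.pi) % (2 * math.pi) - math.pi
    vortices = []
    ni, nj = len(psi), len(psi[0])
    for i in range(ni - 1):
        for j in range(nj - 1):
            # Corners visited counterclockwise around the plaquette.
            loop = [psi[i][j], psi[i + 1][j], psi[i + 1][j + 1], psi[i][j + 1]]
            winding = sum(
                wrap(cmath.phase(loop[(k + 1) % 4]) - cmath.phase(loop[k]))
                for k in range(4))
            charge = round(winding / (2 * math.pi))
            if charge != 0:
                vortices.append((i, j, charge))
    return vortices

# Synthetic field psi = x + i*y: a single charge +1 vortex at the origin,
# which lies inside plaquette (3, 3) on this 8x8 grid.
n = 8
grid = [[complex(x - n / 2 + 0.5, y - n / 2 + 0.5) for y in range(n)]
        for x in range(n)]
found = vortex_charges(grid)
```

Because the winding number is quantized, this detector is robust to smooth phase gradients and only fires on genuine singularities, making it suitable for automated scans of the RN-equation output.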

Alex Shephard – Investigating time management and lifestyle in interdisciplinary programs at McMaster University: A Pilot Study

Undergraduate science students lead busy lives, and the effective implementation of time management skills is crucial for academic success (Kember et al., 1996). University programs dictate the amount of time students devote to in-class learning, while students are responsible for finding time to complete the required coursework. Time devoted to coursework may significantly differ between students in different university programs (Ruiz-Gallardo et al., 2010).  Depending on the nature of the coursework, time devoted to other basic life activities such as paid work, extracurriculars, leisure, and sleep could be compromised (Macan et al., 1990).

Integrated Science is a four-year undergraduate science program at McMaster University. The coursework largely consists of supervised, inquiry-based learning through group research projects. Life Sciences is an alternative program to Integrated Science at McMaster, based primarily on a lecture format. These two programs are similar in content learned but differ in terms of teaching strategy and workload style (McMaster University, 2014). It is hypothesized that these differences may lead to differences in time spent on both out-of-class learning and other activities typical to student life.

The first goal of our research is to test the hypothesis that students enrolled in Integrated Science I differ from students enrolled in Life Sciences I in terms of time allocated to basic life activities. Students from Level I Integrated Science (n=35) and Level I Life Sciences (n=800) will complete an online survey to estimate the amount of time allocated to life activities such as paid work, extracurriculars, leisure, and sleep, in an average university week. Students will then complete a perception-based survey to indicate their satisfaction with time spent on these activities.

The second research goal is aimed specifically at students in Integrated Science I, who face a rigorous and diverse workload. The question will address whether these students allocate an appropriate amount of time to the tasks that make up their coursework, based on the weighting of the tasks in the overall grading scheme of the course. Students will complete an additional online survey to estimate time devoted to group-based projects, small assignments, and studying in an average university week, and then an additional perception-based survey to indicate their satisfaction with their time management.

These data could be useful for educators and program designers, who strive to design university programs that maximize student learning while maintaining a workload that is manageable enough for students to be successful. Additionally, the results could benefit first year course design of the Integrated Science program. Any inconsistencies between time allocation and mark distribution could indicate where students are having difficulties managing time, potentially calling for refinement of course structure.

Hanna Stewart & Pratik Samant

No abstract provided.

Ben Windeler

Objectives: The value of statistical analysis in professional hockey has been widely debated. This report provides an introduction to common statistical measures used to predict the performance of NHL hockey teams, an in-depth explanation of the methods used to choose these measures, and a layman’s explanation of their significance as predictors of success. This is followed by original statistical analysis: first, using these statistical measures as predictors of playoff success, arguably the most important benchmark for team success; second, analyzing specific game outcomes, namely overtime (OT) and shootouts (SO). These games have particular importance in the NHL, as teams are awarded a point for an overtime or shootout loss, which biases teams to extend non-division games into overtime to guarantee a point.

Methods: The majority of public research on statistics in hockey comes from an online community of bloggers and amateur statisticians. The first part of this report synthesizes results claimed by these sources, outlines the exact methods used to obtain the results, and provides well-referenced justification for, and explanation of, the results.

Original analysis was conducted by computing Pearson correlation coefficients. The R software package was used for all data analysis.
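The original analysis was done in R; for illustration, the Pearson coefficient can be computed directly from its definition. The following Python sketch uses hypothetical team numbers (a made-up shot-attempt percentage against made-up playoff win totals), not actual NHL data:

```python
import math

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length samples."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical per-team numbers (not real NHL data): regular-season
# shot-attempt percentage vs. playoff wins.
shot_attempt_pct = [54.1, 52.3, 51.0, 49.8, 48.2, 46.5]
playoff_wins = [12, 8, 10, 4, 5, 1]
r = pearson_r(shot_attempt_pct, playoff_wins)
```

A coefficient near ±1 indicates a strong linear association, while values near zero (as the report finds for OT/SO frequency) suggest the outcome is dominated by chance.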

Results: All of the external results examined were fairly simple to reproduce, but the significance of these results was often exaggerated. In general, while the methods used seem to have been appropriate, their utility for predicting future team success was poor at best.

Analysis showed that the most significant indicator of a team’s success in a given playoff series was its performance against the opposing team during the regular season. The influence of different statistical measures in determining how often teams went to OT or SO was minuscule, as this seems to be dominated by chance.

Conclusions: It is important for any statistical analysis to thoroughly describe its methods in order to provide an unbiased and reproducible predictive model. This report brings these qualities to the typically poorly sourced body of statistical analysis in the NHL. It also highlights useful predictors of playoff success. Finally, the report demonstrates the inherent randomness of points awarded for overtime losses and makes an argument against the validity of awarding points in this scenario.

David Yun – Simulating the formation of complex systems

The Earth’s organisms are too complex to have been formed through strictly random processes (Bonner, 1988; Dawkins, 1986). Current theories for the formation of biological complexity are based on Darwin’s theory of evolution by means of natural selection (Vinicius, 2010). Critics of natural selection argue that a sentient and intelligent designer is required to explain the complexity exhibited in biological structures (Dawkins, 1986; Paley, 1802; Discovery Institute, 2014). Simon (1962) describes a thought experiment imagining two brothers making 1000-component watches. One brother takes a stepwise approach, attempting to assemble all 1000 pieces in a single run, but he loses all of his work each time he is interrupted. The other brother uses a modular approach, constructing 100 subunits of 10 components each. He then combines these subunits into 10 larger units of 100 components each. Finally, he combines these larger units to create a finished watch. Using this approach, he only loses the progress on his current subassembly when interrupted. In this research, Simon’s “watchmaker” parable was evaluated through computer simulation to compare the efficiency of modular construction to that of unstructured, stepwise construction. A model of Simon’s parable was generated in Maple, with the probability of interruption (p) during construction and the assembly structure as the variable parameters. Combinations of these two variables were tested to investigate their effects on the relative efficiency of hierarchical versus stepwise construction. Input parameters for Simon’s parable are also presented in an assignment for an evolutionary biology course at McMaster University; these parameters were tested to evaluate the assignment’s use as a teaching aid. Modular construction productivity decreased linearly as p increased, while stepwise productivity decreased exponentially.
It was also found that Simon’s calculation overstated the productivity of both modular and stepwise construction. His calculated productivity ratio (modular construction 4000 times more efficient than stepwise) was close to the simulated value (2850). Changes to the modular assembly structure did not produce measurable effects on productivity. Under the conditions specified in the assignment, it was found that modular construction is more productive than stepwise construction for p > 0.2. Since p = 1/6 in the assignment, the instructions need to be modified to demonstrate the advantage of modular construction. The findings of this research support Simon’s hypothesis that modular construction is advantageous over stepwise construction when there is a sufficient probability of losing progress. This idea has been applied in evolutionary biology to explain the accumulation of favourable mutations (Vinicius, 2010).
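Although the project’s model was built in Maple, the parable is straightforward to simulate in any language. The Python sketch below is a minimal Monte Carlo version with illustrative parameters (p = 0.1 and a 100-part watch built either monolithically or from 10-part subassemblies, rather than the assignment’s exact setup):

```python
import random

def completed_assemblies(p, parts_per_assembly, total_steps, seed=1):
    """Count assemblies finished within total_steps one-part-per-step work
    sessions, when each step is interrupted with probability p and an
    interruption scatters the current unfinished assembly."""
    rng = random.Random(seed)
    done, in_progress = 0, 0
    for _ in range(total_steps):
        if rng.random() < p:
            in_progress = 0               # interruption: partial work is lost
        else:
            in_progress += 1
            if in_progress == parts_per_assembly:
                done += 1                 # a stable assembly is completed
                in_progress = 0
    return done

STEPS, P = 200_000, 0.1  # illustrative values, not the assignment's p = 1/6
# Stepwise brother: the whole 100-part watch is one fragile assembly.
stepwise_watches = completed_assemblies(P, 100, STEPS)
# Modular brother: stable 10-part subassemblies; ten subassemblies plus one
# final 10-unit join give 11 completed assemblies per finished watch.
modular_watches = completed_assemblies(P, 10, STEPS) // 11
```

With these parameters the stepwise brother almost never finishes a watch (the chance of 100 uninterrupted steps is 0.9^100 ≈ 3 × 10⁻⁵), while the modular brother completes hundreds, illustrating the exponential versus linear decline in productivity described above.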