“Extreme weather leaves Mediterranean countries picking up the pieces. Egypt and Lebanon were the hardest hit, with over 1.2 million people displaced overnight. Malta did not fare much better. The authorities have reported over 2,300 dead or missing, thousands injured, and 74,000 people displaced. Power cuts have been reported all over the island after Turbine Two tripped at the Delimara Power Station. Enemalta has not yet commented. The islands have taken a major blow to their infrastructure, with debris reported up to 1 km from the coast. The AFM and emergency responders were immediately dispatched and are starting to clear arterial roads. Insurance companies are still counting the costs. Valletta, Floriana, and parts of Isla were protected from the storm surge by centuries-old Knights’ fortifications. The following localities have been affected: Birgu, Bormla, Kalkara, Marsa, Gzira, Msida, Pietà, San Giljan, Sliema, Ta’ Xbiex, Xghajra, Birzebbuga, Marsascala, Marsaxlokk, Xlendi and Marsalforn.”
The cut-out above could become reality if a Category 3 storm lashed Malta with winds of 178 to 208 km per hour. The chances are slim but too real to ignore: in 1995 a similar storm formed close to the Maltese Islands, followed by others in 1996, 2006, and 2011. Below are two scenarios that compare Malta as it currently stands against an island with a solid disaster management plan.
[ SCENARIO 1 – AN UNPREPARED ISLAND]
The emergency forces have been inundated with calls for help and have few plans to operate a workable rescue effort. Key personnel were lost at home or while rushing to the scene, since the infrastructure has been knocked out, paralysing the island. Power surges or power cuts have caused fires all over the Islands creating an apocalyptic scenario. With the storm still raging, the lack of a back-end ICT network has rendered communication near impossible.
[ SCENARIO 2 – THE IDEAL SCENARIO]
A fleet of small aerial drones is monitoring the disaster. The authorities are using them to identify the hardest hit areas and map out corridors that allow access on the ground. Emergency vehicles are being deployed safely. Services will be redeployed after safety assessments and clearing of the main infrastructure. Paramedics, NGO rescue teams, and armed forces help move people to safer grounds and carry out rescue operations. Community buildings on higher ground are converted into temporary shelters. In turn, decision-makers are kept informed using an Emergency Room for effective relief.
Imagine the smallest thing you possibly can. The eye of a needle? A human hair? A particle of dust? Think smaller, something you cannot even see, something on a molecular scale. Now imagine that molecule has the potential of a whole laboratory. This dream is now becoming a reality.
In recent years, the field of molecular sensors has grown into one of the most ground-breaking areas in chemistry. Molecular sensors are compounds that can detect a substance, or a unique mixture of substances, and provide an easily detectable output. Usually this is a change in the absorption of ultraviolet or visible light, or in the emission of fluorescence. In other words: colours!
John Gabarretta (supervised by Dr David Magri) created a simple example of these fluorescent molecular sensors. The molecule was based on the Fluorophore-Spacer-Receptor model, where the ‘output’ part of the molecule (the fluorophore — a structure which shines light) is separated from the ‘input’ part (the receptor — a structure which is sensitive to a particular substance, such as acidity or a metal ion) by an intermediate spacer, whose main function is to link these two components together. The model means that a molecule can detect a chemical and respond by shining light or not. The process gives information about the chemicals in a solution.
The molecule was made by a two-step synthetic route (which took several attempts and produced several different colours), and its behaviour was tested by dipping it into acid. In water the molecule was switched ‘off’, but it quickly turned ‘on’ in acidic solution, giving a bright blue glow when exposed to ultraviolet (UV) light — a pretty satisfying sight!
Molecular sensors have some very advanced applications — the pioneer A. P. de Silva said that there is room for a “small space odyssey with luminescent molecules”. This odyssey includes some that detect substances such as sugars.
Very advanced systems are approaching chemical computers: they take multiple inputs and process them using Boolean logic, as in the so-called ‘Moleculator’ and molecular tic-tac-toe games. The future is bright (if you pardon the pun), and with more complex structures more possibilities will appear; the molecular laboratory may become a reality, detecting diseases or toxins in no time at all.
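Molecular logic can be pictured in ordinary programming terms. The sketch below is a toy model (the molecule, its inputs, and its output are invented for illustration, not taken from a real sensor) of a two-input molecular AND gate: the compound fluoresces only when both chemical inputs are present.

```python
# Toy model of a hypothetical two-input molecular AND gate: the molecule
# glows only when BOTH inputs (say, a proton and a sodium ion) are bound.

def and_gate_fluorescence(proton_bound: bool, sodium_bound: bool) -> str:
    """Return the optical output of the imagined sensor."""
    return "bright" if (proton_bound and sodium_bound) else "dark"

# Truth table of the gate: only (True, True) lights up.
for proton in (False, True):
    for sodium in (False, True):
        print(proton, sodium, "->", and_gate_fluorescence(proton, sodium))
```

Real systems encode the same Boolean behaviour in chemistry rather than code: the fluorophore stays quenched until every receptor is occupied.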
This research was performed as part of a Bachelor of Science (Honours) at the Faculty of Science.
Doctors regularly need to use endoscopes to take a peek inside patients and see what is wrong. Their current tools are pretty uncomfortable. Biomedical engineer Ing. Carl Azzopardi writes about a new technology that would involve just swallowing a capsule.
Michael* lay anxiously in his bed, looking up at his hospital room ceiling. ‘Any minute now’, he thought, as he nervously awaited the return of his parents and doctor. Michael had been suffering from abdominal pain and cramps for quite some time, and the doctors could not figure it out through simple examinations. He could not take it any more. His parents had taken him to a gut specialist, a gastroenterologist, who after asking a few questions had simply suggested an ‘endoscopy’ to examine what was wrong. Being new to this, Michael had immediately gone home to look it up. The search results did not thrill him.
The word ‘endoscope’ derives from the Greek ‘endo’, inside, and ‘scope’, to view. Simply put, endoscopy means looking inside our body using instruments called endoscopes. In 1804, Phillip Bozzini created the first such device. The Lichtleiter, or light conductor, used hollow tubes to reflect light from a candle (or sunlight) into bodily openings — rudimentary, but it worked.
Modern endoscopes are light years ahead. Constructed out of sleek, black polyurethane elastomers, they are made up of a flexible ‘tube’ with a camera at the tip. The tubes are flexible to let them wind through our internal piping, optical fibres shine light inside our bodies, and a hollow channel allows forceps or other instruments to be passed through during the procedure. Two of the more common types of flexible endoscope used nowadays are the gastroscope and the colonoscope. These are used to examine your stomach and colon; as expected, they are inserted through your mouth or rectum.
Michael was not comforted by such advancements. He was not enticed by the idea of having a flexible tube passed through his mouth or colon. The door suddenly opened. Michael jerked his head towards the entrance to see his smiling parents enter. Accompanying them was his doctor holding a small capsule. As he handed it over to Michael, he explained what he was about to give him.
Enter capsule endoscopy. Invented in 2000 by an Israeli company, the procedure is simple. The patient just needs to swallow a small capsule. That is it. The patient can go home, the capsule does all the work automatically.
The capsule is equipped with a miniature camera, a battery, and some LEDs. It starts to travel through the patient’s gut. While on its journey it snaps around four to thirty-five images every second. Then it transmits these wirelessly to a receiver strapped around the patient’s waist. Eventually the patient passes out the capsule and on his or her next visit to the hospital, the doctor can download all the images saved on the receiver.
The capsule sounds like simplicity itself. No black tubes going down patients’ internal organs, no anxiety. Unfortunately, the capsule is not perfect.
“The patient just needs to swallow a small capsule. That is it. The patient can go home, the capsule does all the work automatically”
First of all, capsule endoscopy cannot replace flexible endoscopes. Doctors can only use the capsules to diagnose a patient: they can see the pictures and figure out what is wrong, but the capsule has no forceps to take samples for analysis in a lab. Flexible endoscopes can also have cauterising probes passed through their hollow channels, which use heat to burn off dangerous growths. The capsule has no such means. These features make gastroscopies and colonoscopies the ‘gold standard’ for examining the gut. Yet one glaring limitation remains: flexible endoscopes cannot reach the small intestine, which lies squarely in the middle between the stomach and colon. Capsule endoscopy can examine this part of the digestive tract.
A second issue with capsules is that they cannot be driven around. Capsules have no motors. They tend to go along for the ride with your own bodily movements. The capsule could be pointing in the wrong direction and miss a cancerous growth. So, the next generation of capsules are equipped with two cameras. This minimises the problem but does not solve it completely.
The physical size of the pill makes these limitations hard to overcome. Engineers are finding it tricky to include mechanisms for sampling, treatment, or motion control. On the other hand, solutions to a third problem do exist. This difficulty relates to too much information. The capsule captures around 432,000 images over the 8 hours it snaps away, and the doctor then needs to go through nearly all of them to spot the problematic few — a daunting task that uses up a lot of time, increases costs, and makes it easier to miss signs of disease.
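The arithmetic behind that figure is easy to check: at a frame rate in the middle of the range quoted earlier, an 8-hour recording produces exactly the image count mentioned above.

```python
# Sanity check of the image count: 15 frames per second (within the
# 4-35 fps range quoted earlier) sustained over an 8-hour recording.
frames_per_second = 15
hours = 8
total_images = frames_per_second * 60 * 60 * hours
print(total_images)  # 432000
```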
A smart solution lies in looking at image content. Not all images are useful. A large majority are snapshots of the stomach uselessly churning away, or else of the colon, far down from the site of interest. Doctors usually use capsule endoscopy to check out the small intestine. Medical imaging techniques come in handy at this point to distinguish between the different organs. Over the last year, the Centre for Biomedical Cybernetics (University of Malta) has carried out collaborative research with Cardiff University and Saint James Hospital to develop software which gives doctors just what they need.
Following discussions, these clinicians and engineers quickly realised that images of the stomach and large intestine were mostly useless for capsule endoscopy screening.
Identifying the boundaries of the small intestines and extracting just these images would simplify and speed up screening. The doctor would just look at these images, discarding the rest.
Engineers Carl Azzopardi, Kenneth Camilleri, and Yulia Hicks developed a computer algorithm that could first and foremost tell the difference between digestive organs. An algorithm is a bit of code that performs a specific task, like calculating employees’ paychecks. In this case, the custom program developed uses image-processing techniques to examine certain features of each image, such as colour and texture, and then uses these to determine which organ the capsule is in.
Take colours, for instance. The stomach has a largely pinkish hue, the small intestine leans towards yellowish tones, while the colon (unsurprisingly perhaps) changes into a murky green. Such differences can be used to classify the different organs. Additionally, to sort quickly through thousands of images, the images need to be compacted. A specific histogram is used to amplify differences in colour and compress the information. These steps make the image processing easier and quicker.
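As a rough illustration of the idea (a minimal sketch, not the researchers' actual code; the hue values below are invented), an image can be compressed into a handful of coarse colour bins, and organs with different dominant colours then produce clearly different histograms:

```python
# Compress a list of hue values (0-359 degrees) into a small, normalised
# histogram: a compact colour summary of one capsule image.
def hue_histogram(hues, bins=8):
    counts = [0] * bins
    for hue in hues:
        counts[int(hue) * bins // 360] += 1
    total = len(hues)
    return [c / total for c in counts]

# Invented dominant hues: stomach ~ pinkish reds, small intestine ~ yellows,
# colon ~ greens.
stomach   = hue_histogram([350, 355, 10, 5, 340, 0])
intestine = hue_histogram([55, 60, 50, 65, 58, 52])
colon     = hue_histogram([110, 120, 115, 125, 118, 112])
```

Each image is now just eight numbers instead of thousands of pixels, which is what makes sorting through a full recording feasible.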
Texture is another unique organ quality. The small intestine is covered with small finger-like projections called villi. The projections increase the surface area of the organ, improving nutrient absorption into the blood stream. These villi give a particular ‘velvet-like’ texture to the images, and this texture can be singled out using a technique called Local Binary Patterns. This works by comparing each pixel’s intensity to its neighbours’, to determine whether these are larger or smaller in value than its own. For each pixel, a final number is then worked out which gauges whether an edge is present or not (see image).
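In its simplest form (sketched below on a tiny 3×3 patch; this minimal version is illustrative, not the exact variant used in the study), the Local Binary Pattern of a pixel packs the eight neighbour comparisons into one 8-bit code:

```python
# Simplified Local Binary Pattern: compare the pixel at (r, c) with its
# 8 immediate neighbours, clockwise from the top-left, and pack the
# comparisons into an 8-bit code.
def lbp_code(image, r, c):
    centre = image[r][c]
    neighbours = [(-1, -1), (-1, 0), (-1, 1), (0, 1),
                  (1, 1), (1, 0), (1, -1), (0, -1)]
    code = 0
    for bit, (dr, dc) in enumerate(neighbours):
        if image[r + dr][c + dc] >= centre:
            code |= 1 << bit
    return code

flat = [[5, 5, 5], [5, 5, 5], [5, 5, 5]]   # uniform patch
edge = [[9, 9, 0], [9, 9, 0], [9, 9, 0]]   # vertical edge
```

A flat patch yields one characteristic code and an edge yields another; a histogram of these codes over the whole image becomes the texture ‘fingerprint’ that picks out the velvety villi.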
Classification is the last and most important step in the whole process. At this point the software needs to decide if an image is part of the stomach, small intestine, or large intestine. To help automatically identify images, the program is trained to link the features described above with the different organ types by being shown a small subset of images. This data is known as the training set. Once trained, the software can automatically classify new images from different patients on its own. The software developed by the biomedical engineers was tested first using colour alone, then texture alone, and finally both features together. Factoring both in gave the best results.
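The train-then-classify loop can be sketched as a nearest-centroid classifier (a deliberately simple stand-in, not the classifier from the published work; the labels and feature values below are invented):

```python
# Nearest-centroid classification: average the feature vectors of each
# labelled organ in the training set, then assign a new image to the
# organ whose average ('centroid') lies closest.

def centroid(vectors):
    n = len(vectors)
    return [sum(v[i] for v in vectors) / n for i in range(len(vectors[0]))]

def train(training_set):
    """training_set maps an organ label to its labelled feature vectors."""
    return {label: centroid(vecs) for label, vecs in training_set.items()}

def classify(model, vector):
    def sq_dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(model, key=lambda label: sq_dist(model[label], vector))

# Invented 2-D features per image: [dominant-hue bin, texture score]
model = train({
    "stomach":         [[0.0, 0.2], [0.1, 0.3]],
    "small intestine": [[1.0, 0.9], [1.1, 0.8]],
    "colon":           [[2.0, 0.4], [2.1, 0.5]],
})
print(classify(model, [1.05, 0.85]))  # prints: small intestine
```

Combining colour and texture into one feature vector, as in the last line, mirrors why the joint test outperformed either feature alone.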
“The software is still at the research stage. That research needs to be turned into a software package for a hospital’s day-to-day examinations”
After the images have been labelled, the algorithm can draw the boundaries between digestive organs. With the boundaries in place, the specialist can focus on the small intestine. At the press of a button, countless hours and cash are saved.
The software is still at the research stage, and that research needs eventually to be turned into a software package for a hospital’s day-to-day examinations. In the future, the algorithm could even be embedded directly in the capsule. An intelligent capsule would be born, creating a recording process capable of adapting to the needs of the doctor. It would show them just what they want to see.
Ideally the doctor would have it even easier with the software highlighting diseased areas automatically. The researchers at the University of Malta want to start automatically detecting abnormal conditions and pathologies within the digestive tract. For the specialist, it cannot get better than this.
The result? A shorter and more efficient screening process that could turn capsule endoscopy into an easily accessible and routine examination. Shorter specialist screening times would bring down costs in the private sector and lessen the burden on public health systems. Michael would not need to worry any longer; he’d just pop a pill.
* Michael is a fictitious character
[ct_divider]
The author thanks Prof. Thomas Attard and Joe Garzia. The research work is funded by the Strategic Educational Pathways Scholarship (Malta). The scholarship is part-financed by the European Union — European Social Fund (ESF) under Operational Programme II — Cohesion Policy 2007–2013, ‘Empowering People for More Jobs and a Better Quality of Life’
The ancients saw volcanoes as the wrath of their mighty gods. Volcanoes have been blamed for wiping out whole towns, and even for planet-wide extinctions. A local team based in Gozo has just found out if Etna affects the Maltese Islands. Words by Dr Edward Duca.
I first heard about COST (European Cooperation in Science and Technology, a networking platform for scientists, www.cost.eu) way back in 1996, during a pharmacokinetics meeting in Athens. Some participants mentioned that their attendance had been funded by COST. On my return I contacted the Malta Council for Science and Technology to obtain more information. When I learnt that COST funds EU networking, I quickly applied to become a member of a COST action (this is what COST calls a network). After bureaucratic leaps and bounds I became Malta’s representative on a COST action. It certainly opened new horizons for me, and the networks I formed with top researchers in Europe were unique.
By 2010 my enthusiasm resulted in MCST nominating me as Malta’s national contact point for COST. It has been hugely satisfying that in these three brief years Malta’s participation has risen from 6 actions to over 100, with more than 150 Maltese researchers taking part in COST.
Why is COST so important for Malta?
The complaint I hear most often in Malta, not only in academic circles but also among SMEs (small to medium enterprises), is that scientific research is only for the elite, that it is too highbrow, and that it is not relevant to Malta. COST proves otherwise. What else could link disaster bioethics, colour and space in cultural heritage, the comparison of European prostitution policies, and submerged prehistoric archaeology? Other links include the quality of suburban building stocks, integrated fire engineering and response, and language impairment in a multilingual society. COST funds networks across the whole spectrum of research, from the humanities to the fundamental sciences, from string theory to childbirth in various cultures.
Participating in a COST action involves very simple administrative and funding procedures. For once, our small size is an added advantage, since every COST country is allowed to nominate two members to participate in each action, putting Maltese COST researchers on a par with researchers from much larger countries. Achieving these results has not been easy, since many researchers hesitate and require persistent prodding: frequent reminders and one-to-one meetings to persuade them to participate. It has been a real eye-opener meeting researchers in Malta from different disciplines and learning about their research.
Deciding to participate in COST may seem a small step to some, an added administrative burden to others, while some see it as another travel commitment. Yet COST offers an answer to the conundrum of how to overcome our physical (and perhaps, in some instances, mental) insularity. You should not let this opportunity pass…
We experience gravity every day, but how it works is one of the biggest questions in physics. Even with Einstein’s theory of relativity, over 90% of the Universe remains unexplained. A team at the University of Malta is trying to put that in order.
‘I do not know what I may appear to the world, but to myself I seem to have been only like a boy playing on the seashore’, said the famous Isaac Newton. Humanity has progressed in its search for answers by always searching for the next smooth pebble, the next pretty shell. In Malta, a small group of students is trying to understand gravity through the observation of stars and galaxies that light up the night sky.
Gravity has kept our feet on the ground since we started walking upright. Early theories by the Greek philosopher Aristotle (384–322 BC) were interesting but far from the truth. His Universe was built in concentric spheres with Earth at the centre, followed by water, air, fire, and enclosed by the heavens — a rock fell to the Earth because it wanted to return to its original sphere. Clearly, he was wrong.
Aristotle’s concepts were challenged during the Renaissance, when the Italian Galileo Galilei (1564–1642) famously dropped different weights from the tower of Pisa. Contrary to the Greek theory, which stated that the heavier an object is, the faster it falls, Galileo saw the objects all fall at the same rate. Theories need to match observations, otherwise they fail — an invaluable principle used time and again by any decent scientist, including the Maltese group of astrophysicists led by Dr Kris Zarb Adami.
“Space is a dynamic entity ‘moving forward’ in time, the two being bound by light itself”
The first person to suggest a good theory for why rocks fall was Isaac Newton (1643–1727). As the story goes, watching an apple fall triggered Newton to come up with his law of gravitation. He said that anything with mass exerts a force that attracts everything towards it — the bigger the mass, the bigger the force. Since the apple is smaller than the Earth, it falls towards the Earth, and since the Earth is smaller than the Sun, the Earth goes around the Sun. Newton’s law was successfully used to predict the motion of planets and helped discover Neptune.
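In modern notation, Newton’s law of universal gravitation reads:

```latex
F = G \, \frac{m_1 m_2}{r^2}
```

where $G$ is the gravitational constant, $m_1$ and $m_2$ are the two masses, and $r$ is the distance between them: double either mass and the force doubles, double the distance and the force drops to a quarter.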
By the 20th century, holes in Newton’s ideas started to appear when scientists discovered that Mercury’s orbit differed slightly from Newtonian predictions. In 1915, along came Einstein (1879–1955), who again revolutionised our understanding of gravity through his theory of general relativity. Newton had considered time and our three-dimensional space to be independent. Einstein replaced this with the notion of spacetime, which combines space and time into one continuous surface. Space is a dynamic entity ‘moving forward’ in time, the two being bound by light itself.
Large objects like the Sun bend the fabric of spacetime (it is convenient to think of spacetime as a sheet of fabric with balls lying on top of it — bigger balls curve the fabric more). Smaller objects (such as the Earth) try to follow the shortest route around the Sun. The shortest way is curved and it is easy to see how this comes about.
Consider the shortest route from the North Pole to the South Pole: you would naturally move down a curved longitude, which forms part of a circle round the Earth. This concept also explains why the Earth traces an orbit round the Sun. The orbit is the ‘best straight line’ that Earth can trace in the curved spacetime surrounding the Sun. As John Archibald Wheeler neatly summarised it: ‘Spacetime tells matter how to move, matter tells spacetime how to curve’.
Einstein’s biggest blunder
Einstein’s theory of general relativity describes how gravity works. Einstein wanted his equations to represent a static Universe that did not change with time. To this end, he introduced a factor called the cosmological constant that would bring the Universe to a halt. However, this idea was short-lived. Another great (though highly egotistical) physicist, Edwin Hubble, discovered that the Universe was expanding; in the late nineties, observations showed that this expansion is even accelerating, a discovery that led to a Nobel Prize in 2011. It not only means that all matter will eventually disperse throughout the Universe and future generations will see only a blank night sky, but also poses a problem: the reason for this acceleration is completely unknown and unpredicted by Einstein’s theory. And it is not a small factor at all, since this mysterious energy makes up 68% of the energy in the Universe. Nicknamed ‘dark energy’ because it is unseen, it is the biggest problem in modern astrophysics and cosmology.
“If a star’s light is being bent by a galaxy, from Earth it will appear that the star’s light has changed, when in reality it would not have changed at all”
Scientists either have to accept that dark energy is true, or that Einstein’s model has met its limits and physics needs a new way to model gravity, at least on the largest of scales. The Malta astrophysics group is trying to verify and find new models of gravity — these so-called alternative theories of gravity. The idea is to compare observations to the different gravitational theories, including Einstein’s, and see which works best.
Our focus is split two ways: one is the effect that celestial bodies have on each other’s orbital motion, the other is the bending of light around heavenly bodies. For example, our Sun bends spacetime, causing the planets to go round it in ellipses, while the Sun itself wobbles around a very small orbit. Observations show that orbiting bodies shift slightly further each revolution than we would expect. The extra amount is minuscule, so measurements are taken over many orbits, as this magnifies the effect. We use this as a possible test to disqualify alternative theories, and have already shown how an important alternative theory of gravity cannot be true.
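The ‘extra amount’ here is the well-known relativistic advance of the perihelion, the orbit’s point of closest approach to the Sun. Per orbit, general relativity predicts the standard shift

```latex
\Delta\phi = \frac{6\pi G M}{c^{2}\, a\, (1 - e^{2})}
```

where $M$ is the Sun’s mass, $a$ the orbit’s semi-major axis, $e$ its eccentricity, and $c$ the speed of light; for Mercury this amounts to the famous 43 arcseconds per century.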
This is how fundamental science works. If a model does not match observations it needs to be modified to arrive at something that does give all the predictions we require. The end result must be a complete theory by itself but the different components could find their birth in a wide variety of unconnected sources.
The Malta astrophysics group considered a theory called conformal Weyl gravity that is similar to general relativity in every respect except one. This theory behaves exactly like Einstein’s but imposes a further constraint — namely, that the gravitational field remains the same no matter how much it is stretched or squeezed. Simply put, as long as the mass remains the same, gravity does not change. This assumption solves many problems: it makes dark matter and dark energy unnecessary. Dark matter is needed to explain the motion of stars in galaxies. Like dark energy, it is called dark because it cannot be seen or analysed in any way. Making them irrelevant would fill a gaping hole in astrophysics’ knowledge.
When the group tested the Weyl theory, it gave the same result as general relativity and a small additional term. That was not a problem, since effects of this term were so small that they could not be observed with today’s largest telescopes. The problem, as shown by the Maltese astrophysics group, is that the term grows larger with distance and contradicts observations at the largest galactic scales. This was an important nail in the coffin for the Weyl theory of gravity and Einstein’s theory still remains the best model.
Our next step is to test other alternative theories of gravity by analysing how objects orbit each other. In the same way we disproved conformal Weyl gravity, we hope that these tests will help astrophysicists to eventually come closer to a model that correctly explains the cosmos.
Bending light
Gravitational lensing is perhaps the most sensitive test of gravity on cosmological scales. To understand how it works, consider a lit candle and a wine glass. Imagine holding the wine glass and peering at the candle through the glass’ base. The flame will appear distorted and change shape. Now picture a friend standing a couple of feet to your side. The flame will appear normal to them, since they see it from a different perspective and the light does not pass through the glass. Two people with different points of view see different flame shapes. The wine glass’ base distorts the flame because it acts like a lens, changing the direction light travels. Obviously there are no wine glasses in the Universe between the stars and the Earth, but objects with huge masses, like our Sun or whole galaxies, can act like a lens and bend the direction of light by the sheer force of gravity.
When there is no mass to affect it, light travels in straight lines, but insert a massive object and hey presto, the light deflects around it as if it were going through a curved glass lens. The area in which an object feels the gravitational pull of the Earth is called the Earth’s gravitational field. Each object in the Universe has a gravitational field and can therefore pull other objects towards it — like the Earth’s effect on the Moon, which keeps it in orbit.
Anything that enters an object’s gravitational field will feel a pull towards the centre of the object. Imagine a ray of light travelling from one point to another with nothing in between: the ray will travel in a straight line. However, if the ray meets an object along its way to the Earth, the object will pull the ray towards it as a consequence of its gravity. Even though the ray of light will try to keep moving in a straight line, the gravity of the object is strong enough to bend the ray’s path. If a star’s light is being bent by a galaxy, from Earth it will appear that the star’s light has changed, when in reality it has not changed at all. This effect is called gravitational lensing and is currently one of the best tests for alternative theories of gravity, since one can measure the deflection of light and check whether it agrees with the theoretical predictions.
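For a light ray grazing a mass $M$ at a closest distance $b$, general relativity gives the standard small-angle deflection

```latex
\alpha = \frac{4 G M}{c^{2} b}
```

which is twice the value a naive Newtonian calculation suggests; it was the measurement of this factor of two during the 1919 solar eclipse that made Einstein famous.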
Extreme situations like the bending of light by galaxies cause problems for Einstein’s theory. When summing up the masses of the galaxies, we obtain the mass of the objects that are visible in the cluster. Comparing the predicted light deflection with the observed one, astronomers consistently find that the light is bent ‘more’ than is expected. The way to solve this issue is obvious. Introduce a completely invisible mass that increases the amount of bending until the predictions fit the observation: enter dark matter!
The idea of dark matter emerged a while ago. In 1933, the Swiss astronomer Fritz Zwicky suggested it when studying how a galaxy’s rotation changes as one moves further away from the galaxy’s centre. Zwicky observed that the velocities predicted by Einstein’s theory should tear the galaxy apart. In reality, something must be keeping it whole. The idea of an invisible substance called dark matter was born.
Dark matter keeps the Universe together by opposing dark energy that pushes the Universe apart. Dark energy is related to the cosmological constant, previously discarded as Einstein’s biggest blunder, now reintroduced in astrophysicists’ equations to explain the accelerated expansion of the Universe.
The problem with dark matter is that it has never been seen. There is only indirect proof of its possible existence. Deandra Cutajar’s work focused on testing theories where no dark matter is needed. If true, this would put a small spanner into Einstein’s equations.
She tested two theories. They passed the first tests, but they have to pass many more to unseat Einstein’s general relativity. Going back to the Swiss astronomer Zwicky, the two theories could explain why galaxies are not ripped apart by the speed with which they spin. Dark matter could be dead.
In another test, both theories failed to explain the extra gravitational effect observed in lensing. One theory failed miserably, while the other yielded less accurate results than Einstein’s general relativity. Dark matter is reborn; on the other hand, it cannot remain dark. It needs to be found and studied.
No theory of gravity has yet been found to beat Einstein’s equations. Einstein’s explanation of how gravity works is better than Newton’s: a curved spacetime clearly explains why light is bent. Einstein’s theory still holds water, and apart from the cosmological constant (his biggest blunder), he was right about most things. When his stunning prediction of how light bends was confirmed, he replied, ‘I knew the theory was correct. Did you doubt it?’
What the future holds for any theory of gravity is uncertain, but what is definitely true is that the astrophysics group in Malta cannot accept that we do not understand 95% of the Universe.
Assisted conception procedures arose as a type of treatment for infertility. They opened a whole new range of possibilities for couples that were unable to have children due to a variety of problems. Initially, the difficulty addressed was of blocked or absent fallopian tubes in women. This prevented the oocyte from making contact with sperm, hence preventing the formation of an embryo. Naturally, this also prevented an embryo from moving into the uterus, implanting itself, and developing into a foetus.
In vitro fertilisation bypasses the tubes by obtaining oocytes from the ovaries and fertilising them outside the body (in vitro — in glass). The procedure became a reality in humans with the pioneering work of Steptoe and Edwards and the delivery of Louise Brown in 1978. She later gave birth naturally herself.
“In our society, infertility is becoming more common and 8 out of 10 couples can experience problems”
With the further development of ICSI (intracytoplasmic sperm injection) it became possible to fertilise an oocyte (egg) with an individual sperm. This was a breakthrough therapy for men with low or absent sperm counts. When sperm are lacking in the ejaculate, a doctor can retrieve them directly from the testicles or the epididymis (a tightly coiled tube leading from the testes), procedures known as TESA and PESA. In combination with ICSI, these techniques made it possible for these men to father children.
In our society, infertility is becoming more common and 8 out of 10 couples can experience problems. This simple statistic makes these procedures increasingly important. Nowadays, even couples with the most severe problems can become parents.
These procedures have been mired in controversy from the beginning, with most countries allowing the science to proceed within certain safeguards. This restrained approach allows for progress.
Regrettably, infertility still carries a large stigma. The thousands who have benefited from these and other simpler infertility procedures (which are usually tried before assisted conception) do not speak out, normally for fear of how society would perceive them or their children.
IVF essentially means that fertilisation of the oocytes occurs outside the body. The retrieved oocytes are mixed with sperm and, in a percentage of cases, fertilisation succeeds and an embryo starts to develop.
This article continues the focus on IVF from last year’s opinion piece by Prof. Pierre Mallia. Other local experts have been contacted, and we are open to further opinions and comments from our readers.
Over 100 million people suffer from depression. Prof. Giuseppe Di Giovanni talks about his life’s work on the brain chemical serotonin and the search for a new treatment for this debilitating disease that touches so many of us.
The winter rays of sunlight reflected off the snow upon Mount Maiella and the beautiful Adriatic Sea. They lit up the room where I was sitting with Dr Ennio Esposito (head of the Neurophysiology unit, Mario Negri Sud, Italy). On this cold day in February the light was blinding, and it was difficult to make out my long-time friend and colleague. Together we had studied serotonin and dopamine, brain chemicals vital for love, pleasure, and addiction, and linked to depression: my research subject.
‘Ennio, I am tired and frustrated. I am increasingly convinced that our in vivo (whole organism) experimental approach is not the right one. There is too much variability in the results, and if we really want to understand the cause of depression and find a new cure, we need reproducible data and a change of tactic.’
“We still do not understand how many psychoactive drugs actually work, meaning that more research is needed”
At that time, I was using glass electrodes to study changes in the electrical activity of single neurons in the brain. Additionally, I used a technique (microiontophoresis) that records a neuron’s electrical activity while applying a very small amount of a drug. In this way, I could see which brain cell was active and how different chemicals might influence it. Surprisingly, though introduced in the 1950s, these techniques are still some of the best ways to study drug effects on a living brain.
My research focuses on the role of two brain chemicals, dopamine and serotonin, in mental disorders. When stimulated, neurons release chemicals (neurotransmitters). I am interested in dopaminergic neurons, which release dopamine, and serotonergic neurons, which release serotonin. Once released, these chemicals cross the spaces between neurons and bind to proteins called receptors on neighbouring neurons, stimulating or inhibiting them. When they bind, they trigger the cell to fire or shut down. By triggering certain neurons in our brains, they reinforce or change our behaviour.
Dopamine is involved in the pleasure pathway. It switches on for behaviours like emotional responses, locomotion, and reinforcing good feelings. Changes in the level of dopamine affect a person’s reward- and curiosity-seeking behaviour, such as sex and addictive drugs. Serotonin, on the other hand, seems to have a more subtle role. One of serotonin’s major roles is to modulate, or control, the effects of other neurotransmitters, such as dopamine. In the words of Carew, a Yale researcher, ‘Serotonin is only one of the molecules in the orchestra. But rather than being the trumpet or the cello player, it’s the band leader who choreographs the output of the brain.’ The belief that serotonin is the brain’s ‘happy chemical’, that low serotonin levels cause depression, and that antidepressants work by boosting it, is a very simplistic view. In truth, no one knows exactly how dopamine and serotonin levels induce depression.
“I have spent my life trying to figure out the role of dopamine and serotonin in the brain”
A lot of what we do know comes from animal research. Animals used to model this disease are given antidepressants to try to understand how effective they are and how they work. By studying their brains, we can start to comprehend what causes depression. Right now we do not understand the whole picture behind the causes of depression, and patients end up receiving inadequate treatment. We still do not understand how many psychoactive drugs actually work, meaning that more research is needed.
Most drugs were discovered by chance while being used to treat other disorders. For example, the antidepressant Iproniazid was originally developed to fight tuberculosis.
After researchers noticed less depression in patients being treated for tuberculosis, they started prescribing it to depressed patients. In another example from the 1950s, clinicians discovered the first tricyclic antidepressant while searching for new drugs against other mental diseases.
Today, we fortunately have a battery of drugs that can treat depression. Unfortunately, the best drugs on the market only completely alleviate symptoms in 35 to 40 percent of patients compared to 15 to 20 percent taking a placebo (a sugar pill), a fact not publicised in pharmaceutical ads. Another problem is that when people begin taking antidepressants, mood changes can take four weeks or more to appear. This delay in action is one of the major limitations of these medications since it prolongs the impairments associated with depression, increases the risk of suicide, the probability that a patient stops treatment, and medical costs. To tackle these problems pharmaceutical companies and academic researchers want to find more effective and faster acting antidepressant drugs.
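One common way to read response rates like these is the number needed to treat (NNT): how many patients must take the drug for one extra full remission over placebo. A minimal sketch, using the midpoints of the ranges quoted above (taking midpoints is my assumption, not the article’s):

```python
# Rough number-needed-to-treat (NNT) from the quoted response rates.
# Midpoints of the article's 35-40% and 15-20% ranges are assumed.
drug_response = (0.35 + 0.40) / 2      # best drugs: symptoms fully relieved
placebo_response = (0.15 + 0.20) / 2   # sugar pill
nnt = 1 / (drug_response - placebo_response)  # about 5 patients per extra remission
```

On these figures, roughly one patient in five owes a full remission to the drug itself rather than to the placebo effect, which puts the search for better antidepressants in perspective.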
Ennio and I, together with Vincenzo Di Matteo and other researchers at the Mario Negri, have tried to resolve the antidepressant lag-time enigma by studying rats. We first inhibited serotonin reuptake for three weeks using the latest Selective Serotonin Reuptake Inhibitors (SSRIs): fluoxetine, sertraline, and citalopram. Then we measured the electrical activity of dopamine and serotonin neurons in rat brains. We discovered that the therapeutic effect of antidepressants is not only due to their capacity to restore a normal level of serotonin activity; repeated treatment also induces adaptive mechanisms in the dopaminergic system (which releases dopamine).
How do SSRIs treat depression? At first, these drugs only slightly stimulate serotonin release. Long-term treatment kicks in an adaptive process: the receptors located on serotonergic neurons that inhibit serotonin activity become insensitive, freeing serotonin neurons from this ‘brake’. With repeated use (and a lag time of 2–8 weeks), serotonin transmission increases and stays high for longer, which is responsible for the SSRIs’ antidepressive effect.
The perfect antidepressant could lie in blocking the activity of these receptors, since there would be no major delay in action. This hypothesis was confirmed by Francesc Artigas and his research group (University of Barcelona). They administered pindolol, a drug capable of blocking these serotonin receptors, and observed an increase in the antidepressive effect of the drugs paroxetine and fluvoxamine, achieved by reducing the latency period. Patients on pindolol did noticeably better, and the clinical data matched those from laboratory animals. Blocking this type of serotonin receptor could be a promising therapy to reduce the latency period and possibly increase antidepressant action.
My colleagues and I formed an alternative hypothesis for why the clinical effects of these drugs are delayed for so long, focusing our attention on the dopaminergic system. We showed that acute administration of different SSRIs reduces the electrical activity of dopaminergic neurons. These drugs increase the levels of serotonin, which decreases dopaminergic neuronal activity by overstimulating another inhibitory serotonin receptor, this time located on dopaminergic cells. The result? The drugs taken to cure depression paradoxically induce an initial reduction of dopamine, supposedly the neurotransmitter of well-being and happiness! Indeed, SSRIs can worsen patients’ depression in the first few weeks of treatment.
When the drugs are used over a longer period (3–4 weeks), the initial reduction of dopamine reverses. The change happens because repeated treatment reduces the sensitivity of this type of serotonin receptor on dopaminergic cells, freeing them from their serotonin ‘brake’.
“All of our work has made it possible to consider new treatments of depression”
We think we have found the reason why SSRI antidepressants take so long to work. Two different serotonin receptors need to become insensitive to the level of serotonin in the brain: one found on serotonergic cells, the other on dopaminergic cells. Their insensitivity allows the activity of dopaminergic neurons to return to normal even though serotonin activity has been bumped up.
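The time course this hypothesis predicts can be sketched with a toy calculation. This is purely illustrative: the strength of the serotonin ‘brake’ and the rate at which the receptor desensitises are invented numbers, not data from the studies described.

```python
def dopamine_activity(week, serotonin_brake=1.0, half_life_weeks=1.0):
    """Relative firing of dopaminergic neurons under chronic SSRI treatment.

    Toy model of the two-receptor desensitisation hypothesis: raised
    serotonin inhibits dopamine cells through a receptor whose
    sensitivity fades with repeated treatment. Parameters are assumed.
    """
    sensitivity = 0.5 ** (week / half_life_weeks)   # receptor 'brake' decays
    return 1.0 / (1.0 + sensitivity * serotonin_brake)

# Baseline activity is 1.0; acute treatment (week 0) halves it,
# then activity climbs back as the receptor desensitises.
course = [round(dopamine_activity(w), 2) for w in range(9)]
```

In this sketch, dopaminergic activity dips to half its baseline on acute treatment and climbs back towards normal over six to eight weeks, mirroring the clinical lag before mood improves.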
Other labs have confirmed our results, which is a vital step for a theory to become fact. Cremers and his team (University of Groningen, Netherlands) have shown that blocking the same type of serotonin receptor on dopaminergic cells in rats can improve the effect of SSRI antidepressants. Ultimately, all of our work has made it possible to consider new treatments for depression, which I am very happy to see.
Many questions remain unanswered about depression. The most urgent task is to find a more effective way to treat it. This is my goal: I have spent my life trying to figure out the role of dopamine and serotonin in the brain, with some notable successes. I hope to see the next generation of antidepressants, which would improve the lives of 121 million depression sufferers.
Ennio listened to me as I expressed my frustration after once again obtaining conflicting results in the laboratory. ‘Giuseppe,’ he said, ‘you are right, billions of neurons in our brain behave differently. But as Douglas Adams said, “If you try and take a cat apart to see how it works, the first thing you have on your hands is a non-working cat. Life is a level of complexity that almost lies outside our vision” (The Salmon of Doubt). If we want to break the code of the brain and hope to treat its diseases, we need a holistic approach that takes the whole brain into account.’
This article is dedicated to the prominent researcher Dr Ennio Esposito, colleague and friend of Prof. Di Giovanni (Department of Physiology and Biochemistry, UoM). He died of a heart attack in 2011, having suffered in his last years from a severe refractory bipolar depression. If interested in an M.Sc. or Ph.D. in biological psychiatry, please contact Prof. Giuseppe Di Giovanni.
A TED talk about targeted psychiatric medications which, like Prof. Di Giovanni’s work, was born of the realisation that current treatments are not good enough for everyone.
Biomedical Ph.D. student Alexandra Fiott (TV show and logistics) shares her thoughts on why the walls between the public and scientists need to come crumbling down.
Location: Merchant Street, St James Cavalier; now in the Biology Department, to be permanently exhibited in the new medical wing at the University of Malta.