Transforming The Educational Landscape For 21st Century Learners Using Technology

The 2019 Technology Invention Conference, held last December in Bangkok, Thailand, featured some of the country's most groundbreaking research, inventions, and innovations. The studies presented explored approaches from fields such as statistics, biotechnology, chemistry, and marine science, among others. The goal? Restoring the deteriorating ecosystem.

Incorporating Technology And Education
Such conferences underscore the importance of incorporating modern technology not only in higher education institutions and academic research but also in primary and secondary education centers. After all, those are the places where learning and development begin. With the world in constant flux, people who need and want to make life better continue to pursue scientific and technological breakthroughs. Students become global citizens whose ability to access, understand, and control technology can determine whether or not they succeed in their pursuits.

Source: pixabay.com

With the explosion of information and available sources of knowledge, education is one of the sectors that saw massive and rapid change from the technological revolution. Earlier generations of students who flocked to libraries have been replaced by a breed of learners who can access knowledge at any time and place. Big data can now be processed by statistical software in seconds. Mechanical and architectural designs can be drafted with digital tools, and their motion and composition analyzed with ease.

Making The Transition
Expectedly, learning institutions have had to adapt their instruction to online and digital methods of teaching, given the clear benefits of technology. Some have already required their pupils to shift to digital modes of studying on tablets or laptops. This is mostly the case for private schools and students with the purchasing capacity.

Numerous discussions between parents and teachers were conducted before rollout. These discussions focused on the trade-offs between efficient, lightweight technologies and the expenses, security and safety issues, and compromised attention they entail. Unfortunately, the shift also widens the gap for students who have little to no resources.

Source: pixabay.com

We are at a time when possibilities are endless. Thanks to the iterative interplay of education and society, individuals continue to experience, ideate, prototype, test, and inspire. Learning is now literally at your fingertips. It is up to knowledgeable and capable individuals, as well as society at large, to promote digital literacy and equip students with the resources they need. Then the youth can hopefully find cures for the world's most complex problems.

Will Technology Ruin Your Children’s Mental Health Development? Psychologist Says Yes

Source: pixabay.com

Today, technology keeps evolving and is becoming so easy to use that even very young children have no problem navigating it. There are also apps and other forms of entertainment that can be deemed fun or educational, but what we often overlook is that they can be addicting, too. Too much of anything (hours upon hours of watching TV, playing video games, and using tablets) is bad for our mental health, according to a psychologist.


Blood Pressure Prediction Using Personal Data (Technological Mindfulness Discussion)

 

Source: defense.gov

 

Wearable technology has flooded the market this year. It seems as though every tech company is adding its own touch to the wearable device industry. The innovation is applicable to almost anything: there are wearable devices for fitness, health, entertainment, and even sports, and they are excellent for analytics and monitoring purposes. That is why engineers at the University of California San Diego developed a wearable system capable of predicting a person's blood pressure. The technology uses the user's personal data to create a set of personalized recommendations for managing blood pressure. With its predictive capability, it is considered a first of its kind on the market.

 

The Research Behind The Ground-Breaking Technology

This research enabled the UC San Diego team to win the Best Paper award at the Institute of Electrical and Electronics Engineers (IEEE) Healthcom 2018. According to the team, it is the first system that can track blood pressure levels daily from the user's collected and analyzed data. The predictive wearable pinpoints the exact health behaviors that need modification to lower blood pressure, giving the user clearer goals and a to-do list to follow instead of a generic array of references. The model is built from factors such as sleeping behavior, lifestyle habits, blood pressure readings, and daily exercise monitoring.
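
To make the idea concrete, here is a minimal sketch assuming a per-user linear model over a handful of daily features. The feature names, values, and model form are illustrative assumptions, not the UC San Diego team's actual method.

```python
# A minimal sketch (not the UC San Diego team's actual model): fit a
# per-user linear model that predicts next-day systolic blood pressure
# from daily behavior features, then rank which behavior change would
# help most. All feature names and values are hypothetical.
import numpy as np

# One row per day: [sleep_hours, exercise_minutes, resting_heart_rate]
X = np.array([
    [6.0, 10, 78],
    [7.5, 30, 72],
    [5.5,  0, 80],
    [8.0, 45, 70],
    [6.5, 20, 75],
])
y = np.array([138, 126, 142, 121, 131])  # next-day systolic BP (mmHg)

# Ordinary least squares with an intercept term.
A = np.column_stack([X, np.ones(len(X))])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)

def predict(sleep_hours, exercise_minutes, resting_hr):
    return coef[:3] @ [sleep_hours, exercise_minutes, resting_hr] + coef[3]

# Which single behavior tweak lowers tomorrow's predicted BP the most?
today = (6.0, 15, 76)
tweaks = {
    "sleep +1 hour":    (today[0] + 1, today[1], today[2]),
    "exercise +20 min": (today[0], today[1] + 20, today[2]),
}
baseline = predict(*today)
for name, t in tweaks.items():
    print(f"{name}: predicted change {predict(*t) - baseline:+.1f} mmHg")
```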

A University of California School of Engineering professor notes that patients are not very compliant when doctors recommend lifestyle changes for the benefit of their overall health. Doctors' recommendations are often not taken seriously unless it is already a life-and-death situation. Prompted by this, the UC researchers tested how patients follow particular medical orders and approaches. They concluded that it is easier for an individual to act on a visual representation of their own data than to comply with lifestyle changes from verbal recommendations alone.

 

Source: wikimedia.org

 

The First-Hand Testing

The study conducted by the UC team gathered data from eight volunteer patients over 90 days. The research confirmed that personalized data is far more effective than the general data held in most healthcare databases. Each person makes unique lifestyle choices, and suggestions drawn from generic databases rarely pinpoint a specific recommendation. This is why the research reported better outcomes: average systolic blood pressure dropped by 15.4 percent and diastolic pressure by 14.2 percent after one week of using the predictive wearable device.

The research shows that wearing wireless predictive devices offers a more systematic way of managing blood pressure levels, exceeding expectations compared with the conventional approach of visiting the doctor only when blood pressure rises. Paired with a wireless blood pressure monitor (such as the Fitbit Charge HR or Omron Evolv), the algorithm predicts a person's health parameters. The data accumulated from these devices can help both patient and doctor adjust treatments and prescriptions without many constraints, giving doctors a way to increase the overall efficiency of blood pressure monitoring.

 

Source: pxhere.com

 

The device is still under development. Dey, its creator, and the rest of the team have partnered with a group of specialists from University of California Health. They plan to test the system on a larger sample size. The data gathered from this collaboration is meant to improve the device's overall performance and provide a basis for studying the long-term effect of health behaviors on blood pressure.

 

 

 

Energy-Efficient Neuromorphic Computing

 

Source: defense.gov

 

The invention of the computer has pushed humanity to new heights. Back in the day, it was impossible to handle data gathering and intricate calculations at scale. Today, much of it is possible with a single computer.

The revolution in data processing and computing is still on the rise, with no signs of slowing down; computer technology handles millions of entries every day and brings people accuracy, convenience, and reduced effort. However, most computers are not energy efficient. Thanks to new research from Binghamton University (part of the State University of New York), more energy-efficient computers may now be possible.

 

The Developmental Start

Many devices depend on Wi-Fi for connectivity, which poses several disadvantages for the end user. That is why Louis Piper (an associate professor of physics and director of materials science and engineering at Binghamton University) is trying to develop energy-efficient computing that keeps a device fully responsive. The work focuses on a structure that keeps devices functional without relying on a constant Wi-Fi link to a primary computer system.

According to Professor Piper, there are two ways to address the problem of device connectivity and data processing: at the signal-connectivity level or at the hardware level (which is what his team is currently pursuing). The main idea of their project is to create chips capable of handling computation on their own rather than passing data back and forth with a larger computer or server.
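
As a rough illustration of why on-device computation matters, the sketch below (hypothetical numbers, not Piper's hardware) contrasts shipping every raw sensor sample to a server with summarizing the data locally and sending only the result.

```python
# A minimal sketch of the on-device computing motivation: summarizing
# locally means far less data has to cross the network.
import random

samples = [random.gauss(25.0, 0.5) for _ in range(1_000)]  # e.g., temperature readings

# Cloud-style approach: every raw sample crosses the network.
bytes_sent_raw = len(samples) * 8  # roughly 8 bytes per float

# Edge-style approach: compute locally, send only a tiny summary.
summary = {
    "mean": sum(samples) / len(samples),
    "max": max(samples),
    "min": min(samples),
}
bytes_sent_summary = len(summary) * 8

print(f"raw upload:    {bytes_sent_raw} bytes")
print(f"local summary: {bytes_sent_summary} bytes")
```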

 

Source: flickr.com

 

The Complexity Of The Creation

Modern research is trying to replicate the functions of neurons in the human brain, and through this continued effort, scientists have created neuristors. Neuristors are circuits that behave much like biological neurons and can perform complex computations using minimal power. Part of a neuristor is made of niobium dioxide, whose switching behavior resembles that of ion channels.
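
For a sense of the behavior such circuits target, here is a minimal software sketch of a leaky integrate-and-fire neuron. It is a common simplified neuron model, not a simulation of the niobium dioxide device itself, and every parameter is illustrative.

```python
# A minimal leaky integrate-and-fire neuron: a software analogue of the
# spiking behavior neuristors aim to reproduce in hardware. Parameters
# are made up for illustration.
def simulate_lif(inputs, threshold=1.0, leak=0.9):
    """Integrate input current with leakage each step, and emit a spike (1)
    whenever the membrane potential crosses the threshold, then reset."""
    potential = 0.0
    spikes = []
    for current in inputs:
        potential = leak * potential + current
        if potential >= threshold:
            spikes.append(1)
            potential = 0.0  # reset after firing
        else:
            spikes.append(0)
    return spikes

# Weak input fires rarely; strong input fires regularly.
print(simulate_lif([0.2] * 10))  # sparse spiking
print(simulate_lif([0.6] * 10))  # regular spiking
```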

Though the study is progressing, these components are very difficult to fabricate; the process is complicated, elaborate, and time-consuming. The parts require a companion capacitor to function within the neuristor circuit, and fabrication is especially challenging because forming the niobium dioxide has required a jolt of electricity.

Such a process is not how engineers normally do fabrication, nor how silicon transistors are incorporated during manufacturing. That is why, in the study proposed by the Georgia Tech team on Nb2O5-x-based devices, no additional pulse of energy is injected. Piper's team verified this method and said it could lead to inexpensive, energy-efficient neuristor circuits.

The team aims to create the devices through traditional fabrication, without the extra manufacturing steps. Current data suggests, however, that making highly efficient devices may require deviating from the original manufacturing technique. According to Binghamton, Piper's team is modeling the neuristor from an atomic perspective; they do not want to compromise the nature of the materials, and they want to understand how the device evolves during operation.

Source: army.mil

 

Technology keeps getting smaller while becoming more effective. Gone are the days of gigantic computers handling thousands or millions of data processes. The future of technological advancement points toward a single device that can compute, manage, retain, and process information in a split second.

 

 

Brain-Inspired Computer Architecture For Data Handling

 

Source: pixabay.com

 

The amount of data processed today is far greater than the information fed into computers long ago. Today everything is digitally available, and as digital data floods present computer systems, hiccups during processing are inevitable; current architectures struggle with the abundance of data that needs constant handling. That is why IBM researchers have been developing an upgraded computer architecture with a different approach, one that can support both existing and emerging workloads.

Results show that the new computer architecture outperforms conventional systems built with today's technology. Artificial intelligence has found its way into the mainstream market, and the new system is efficient at handling, storing, and processing elaborate data from different sources. That is because its design mimics several concepts of the human brain, which, to this day, remains the ultimate data processor.

 

Going Deep With Its Development

During the 1940s, a computer architecture called the "von Neumann" architecture was devised, and it has since become the basis of nearly every computer system. Its structure includes the following essential components: a central processor that acts as the brain of the system and carries out arithmetic and logic operations; a storage and memory unit that holds programs and data; and input/output (I/O) devices used to move data into and out of the system. These are the generic parts of a computer, as the toy fetch-decode-execute loop below illustrates.
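
The toy sketch below shows that fetch-decode-execute loop with program and data sharing one memory. The instruction set is invented purely for illustration.

```python
# A toy sketch of the von Neumann idea: program and data share one
# memory, and the processor repeatedly fetches, decodes, and executes
# instructions, shuttling values between memory and the CPU each step.
memory = [
    ("LOAD", 8),     # 0: acc <- memory[8]
    ("ADD", 9),      # 1: acc <- acc + memory[9]
    ("STORE", 10),   # 2: memory[10] <- acc
    ("HALT", None),  # 3: stop
    None, None, None, None,
    6,               # 8: data
    7,               # 9: data
    0,               # 10: result goes here
]

acc, pc = 0, 0
while True:
    op, arg = memory[pc]  # fetch
    pc += 1
    if op == "LOAD":      # decode + execute
        acc = memory[arg]
    elif op == "ADD":
        acc += memory[arg]
    elif op == "STORE":
        memory[arg] = acc
    elif op == "HALT":
        break

print(memory[10])  # 13 -- every step moves data between memory and CPU
```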

 

Source: maxpixel.net

 

The IBM researchers, however, proposed a far more efficient system that can achieve up to 200 times faster performance than its conventional counterpart. They focused on a brain-like computer in which processing and memory coexist. To move forward, the researchers based their brain-inspired architecture on three levels of the human brain concept, listed below (a small sketch of the third level follows the list).

  • The first level makes use of the memory devices' state dynamics. Much as the human brain performs tasks ranging from basic to complex recollection, the processing of the data happens within the memory itself; in the human intellect, processing and memory occur in the same location.
  • The second level draws on the brain's synaptic network structure, using the phase-change behavior of its memory components. This concept was used to accelerate training of deep neural networks, allowing the computer to respond accurately and perform substantial numerical work in place.
  • Finally, the stochastic nature of the brain's neurons and synapses inspires the third level. Through this concept, the researchers assembled a superior computational substrate for boosting neural networks; its characteristic analog storage resembles biological synapses, packing a great deal of data into a single nanoscale device.
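
As a loose illustration of the third level, the sketch below applies stochastic rounding of weight updates onto a coarse-resolution "synapse." This is an assumption-laden toy, not IBM's implementation; the step size and updates are invented.

```python
# A minimal sketch of a stochastic synapse: weight updates smaller than
# the coarse analog device can store are applied probabilistically, so
# tiny updates are still captured on average.
import random

STEP = 0.1  # smallest weight change the hypothetical device can store

def stochastic_update(weight, delta):
    """Apply delta using stochastic rounding to multiples of STEP."""
    steps = delta / STEP
    whole = int(steps)
    frac = steps - whole
    if random.random() < abs(frac):
        whole += 1 if frac > 0 else -1
    return weight + whole * STEP

# A +0.03 update is below the device resolution, but applied
# stochastically many times it accumulates to roughly the right value.
w = 0.0
for _ in range(1000):
    w = stochastic_update(w, 0.03)
print(round(w, 2))  # roughly 1000 * 0.03 = 30.0 on average
```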

According to the IBM research team, the new computer design is bound to perform far better than conventional computing systems. The researchers made sure the architecture is a faithful translation of how the human brain functions, so it can potentially manage, handle, store, and process millions of data points without hassle. Even at the conceptual stage, they predicted that the new architecture would exceed users' expectations.

 

Source: pexels.com

 

Development of the new computer architecture is still underway, but the latest design is already a marked improvement over the conventional system. It can process the flood of information from AI workloads without hiccups, letting it work accurately and quickly.

 

 

Algorithm With The Ability To Find Antibiotic Candidates

 

Source: pixabay.com

 

Technology for making medicines has improved over the years. However, diseases have also evolved into more formidable opponents as technology advances. That is why an international research team has proposed restructuring pharmaceutical methods: in the search for new techniques, the industry is focusing on the source compounds first before creating a cure.

The pharmaceutical field is one of the wealthiest industries in the world, yet many infectious diseases still lack effective treatments. Researchers from several prestigious universities (including Carnegie Mellon University, the University of California San Diego, and St. Petersburg in Russia) came together with a proposal: a modern means of searching large repositories of compounds produced by microbes.

Analyzing the mass spectra pinpoints the known compounds already present in the repository, using information gathered from previously characterized microbial products, so attention can shift to the unknowns that could potentially yield more effective antibiotics or even anticancer medicine. The researchers named the algorithm Dereplicator: a way of efficiently finding promising antibiotic candidates among the many compounds on file.
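
The sketch below captures the dereplication idea in miniature: match an observed spectrum against a small library of known compounds by counting shared peaks within a mass tolerance, and flag poor matches as unknowns. The peak lists, tolerance, and scoring are made up and far simpler than Dereplicator's actual statistics.

```python
# A minimal sketch of dereplication (not the actual Dereplicator
# scoring): spectra that match a library entry well are "known";
# the rest are candidates for further investigation.
TOLERANCE = 0.02  # Da; the library peak lists below are hypothetical

library = {
    "compound_A": [120.08, 147.11, 175.12, 204.13],
    "compound_B": [101.07, 129.10, 186.12, 243.13],
}

def shared_peaks(observed, reference, tol=TOLERANCE):
    """Count reference peaks that appear in the observed spectrum."""
    return sum(
        any(abs(obs - ref) <= tol for obs in observed)
        for ref in reference
    )

def dereplicate(observed, min_shared=3):
    best = max(library, key=lambda name: shared_peaks(observed, library[name]))
    if shared_peaks(observed, library[best]) >= min_shared:
        return best
    return "unknown - candidate for further investigation"

spectrum = [120.09, 147.10, 175.13, 210.50, 204.12]
print(dereplicate(spectrum))  # matches compound_A
```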

 

Source: wikimedia.org

 

Testing The Algorithm

Before mass spectrometry, mining such data repositories stalled because of their complexity, not to mention the high cost of the process and the hit-or-miss odds of merely rediscovering known compounds. The researchers nevertheless tested Dereplicator for a week on 100 computers. In that test, the algorithm sorted through a billion mass spectra in the Global Natural Products Social molecular network at the University of California, primarily measuring the masses of ionized material within each sample. Encouragingly, it identified more than 5,000 unknown compounds that need to undergo a more thorough investigative process.

After the breakthrough, the researchers made the algorithm available to any investigator who wants to study additional repositories, with the aim of testing its effectiveness and gathering feedback across the pharmaceutical industry. According to some researchers, analyzing compounds' mass spectra is an inexpensive way of recognizing new types of drugs, although earlier methods were limited to peptides, which are composed of simple structures like chains and loops.

To analyze a wider variety of complex compounds, the collaborating researchers created a technique for predicting how a mass spectrometer would break molecules apart. The method starts with the weakest rings and simulates the molecule breaking step by step. Machine learning was integrated into the algorithm using 5,000 known compounds and their mass spectra, producing a computer model that can predict how other, unknown compounds break down.
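
As a simplified, peptide-only illustration of fragment prediction (not the ring-breaking simulation or the trained model described above), the sketch below lists theoretical prefix and suffix fragment masses for a short peptide, which could then be compared against an observed spectrum. The residue masses are approximate monoisotopic values.

```python
# A simplified fragment-prediction sketch limited to linear peptides:
# for each cut point, compute the singly charged b-type (prefix) and
# y-type (suffix) fragment masses.
RESIDUE = {"G": 57.0215, "A": 71.0371, "S": 87.0320, "L": 113.0841, "K": 128.0949}
PROTON, WATER = 1.0073, 18.0106

def predict_fragments(peptide):
    """Return (label, m/z) pairs for b- and y-ions of a linear peptide."""
    fragments = []
    for i in range(1, len(peptide)):
        prefix, suffix = peptide[:i], peptide[i:]
        b = sum(RESIDUE[r] for r in prefix) + PROTON
        y = sum(RESIDUE[r] for r in suffix) + WATER + PROTON
        fragments.append((f"b{i}", round(b, 3)))
        fragments.append((f"y{len(suffix)}", round(y, 3)))
    return fragments

for name, mz in predict_fragments("GASLK"):
    print(name, mz)
```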

According to lead researcher Mohimani, the Dereplicator algorithm not only recognizes known compounds that need no further investigation, it can also identify less common variants of known compounds that would otherwise go undetected.

 

Source: pexels.com

 

With this new algorithm now available, new ventures in the pharmaceutical field are wide open. It should speed up the creation of suitable, effective drugs for many diseases, and people and healthcare providers can reasonably expect the Dereplicator algorithm to become a gateway to a more efficient compound discovery and rediscovery process.