Machine Learning – What Is It About?

Much of the world is experiencing the rise of technology, yet only a few people truly understand what it is. Thanks to books and movies, each generation has already formed its own fantasy of artificial intelligence: worlds ruled – or at least served – by multifunctional machines and robots.

Source: tehnot.com


Blood Pressure Prediction Using Personal Data (Technological Mindfulness Discussion)

 

Source: defense.gov

 

Wearable technology has flooded the market this year. It seems like every tech company is adding its own touch to the wearable device industry, and the innovation applies to almost anything: there are wearable devices for fitness, health, entertainment, and even sports, and they are excellent tools for analytics and monitoring. That is why engineers at the University of California San Diego developed a wearable device capable of predicting a person's blood pressure. The technology uses the user's personal information to create a set of personalized recommendations for managing blood pressure. Because of this predictive capability, it is considered the first of its kind on the market.

 

The Research Behind The Ground-Breaking Technology

This research enabled the UC San Diego team to win the Best Paper award at the Institute of Electrical and Electronics Engineers (IEEE) Healthcom 2018. According to the research team, it is the first device that can estimate blood pressure levels on a daily basis from the user's collected and analyzed data. The predictive wearable pinpoints the exact health behaviors that need modification to lower blood pressure, giving the user concrete goals and a to-do list rather than an array of generic references. The prediction is built from factors such as sleeping behavior, lifestyle habits, blood pressure readings, and daily exercise monitoring.

A professor at the University of California's school of engineering notes that patients are often not compliant when their doctors recommend a change of lifestyle for the benefit of their overall health. Doctors' recommendations are rarely taken seriously unless it is already a life-and-death situation. Building on that observation, the UC researchers tested how patients follow a particular medical order and approach. They concluded that it is easier for an individual to act on a visual representation of their own data than to comply with lifestyle changes based on verbal recommendations alone.

 

Source: wikimedia.org

 

The First-Hand Testing

The study conducted by the UC team gathered data from eight volunteer patients over 90 days. The research confirmed that personalized data is much more effective than the general data held in most healthcare databases. Each person makes unique lifestyle choices, and the suggestions drawn from general healthcare data do not pinpoint specific recommendations. That is why the study reported strongly positive results: average systolic blood pressure dropped by 15.4 percent and diastolic pressure by 14.2 percent after one week of using the predictive wearable device.

The research shows that wearing a wireless predictive device is a more systematic way of managing blood pressure levels, exceeding what the conventional approach of visiting the doctor for a blood pressure checkup can offer. Paired with a wireless blood pressure monitor (such as the Fitbit Charge HR or Omron Evolv), an algorithm predicts the person's health parameters. The data accumulated by the device can help both patient and doctor adjust treatments and prescriptions without a lot of constraints, and it gives doctors a way to increase the overall efficiency of blood pressure monitoring.
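The article does not publish the team's actual model, but the general idea of predicting blood pressure from daily behavioral features can be sketched with a simple regression. The feature names, sample values, and use of scikit-learn below are illustrative assumptions, not the UC San Diego pipeline.

```python
# Hypothetical sketch: predict systolic blood pressure from daily wearable features.
# Feature names and numbers are invented for illustration only.
import numpy as np
from sklearn.linear_model import LinearRegression

# Each row: [hours_of_sleep, exercise_minutes, resting_heart_rate]
X = np.array([
    [7.5, 30, 62],
    [6.0, 10, 71],
    [8.0, 45, 58],
    [5.5,  5, 75],
    [7.0, 20, 66],
])
# Corresponding systolic readings (mmHg) from a cuff or wearable monitor.
y = np.array([118, 131, 112, 138, 124])

model = LinearRegression().fit(X, y)

# Predict tomorrow's systolic pressure for a day with 6.5 h sleep,
# 15 min of exercise, and a resting heart rate of 69 bpm.
predicted = model.predict(np.array([[6.5, 15, 69]]))[0]
print(f"Predicted systolic BP: {predicted:.1f} mmHg")

# The learned coefficients hint at which behavior to change first,
# which is the kind of personalized recommendation the article describes.
print(dict(zip(["sleep", "exercise", "heart_rate"], model.coef_.round(2))))
```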

 

Source: pxhere.com

 

The device is still under development. Dey, who leads the project, and the rest of the team have partnered with a group of specialists from University of California Health, and they plan to test the system on a much larger sample. The data gathered from this collaboration is meant to improve the device's overall performance and provide a basis for studying the long-term effect of health behaviors on blood pressure.

 

The Future Of Information Technology With Nanoscale Pillars

 

Source: wikimedia.org

 

Modern information technology relies on electron charge and on light as the primary media for processing and data transfer. However, researchers from Linköping University and the Royal Institute of Technology in Sweden think it is possible to process information much faster by building devices from far smaller structures. The innovation is also expected to integrate easily into today's AI and smart devices while keeping energy consumption low.

According to the studies, the new device concept makes processing and data transfer much faster because the electron's spin can be transferred to light at room temperature. The idea is called "spintronics": exploiting both the charge and the spin of electrons. The researchers are confident that the new device can gather data efficiently, and they regard this technique as a giant step toward the future of information technology.

 

Concept Of Spintronics

A simple way to picture spintronics is the Earth spinning around its axis, either clockwise or counterclockwise. In the same way, an electron can be in a "spin up" or "spin down" state.

These two states represent the binary bits 1 and 0 used to carry information. In principle, an LED converts the data held in the spin states into light, which is then carried over fiber optics, enabling the transfer of information over long distances. This approach opens the way for IT that makes use of both electron spin and light, a type of technology already known as opto-spintronics.

 

Source: wikimedia.org

 

The primary basis of information transfer in opto-spintronics is the principle that the spin state of the electron determines the characteristics of the emitted light. The light in question is "chiral light": circularly polarized light in which the electric field rotates either clockwise or counterclockwise as the wave propagates. This directional quality is beneficial for fast and accurate data transfer and processing. However, as advantageous as this process may sound, there is one problem that needs to be addressed.
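As a rough illustration of the mapping described above, the two spin states can be associated with the two circular polarizations of the emitted light, which in turn encode the two bit values. The field expression below is the standard textbook form of circularly polarized light, not a formula taken from the Linköping work.

```latex
% Spin-to-light-to-bit mapping (illustrative):
|\uparrow\rangle \;\rightarrow\; \sigma^{+}\ \text{(counterclockwise)} \;\rightarrow\; 1,
\qquad
|\downarrow\rangle \;\rightarrow\; \sigma^{-}\ \text{(clockwise)} \;\rightarrow\; 0

% Circularly polarized (chiral) light; the sign selects the rotation sense
% of the electric field and therefore the chirality:
\mathbf{E}_{\pm}(z,t) = E_{0}\left[\hat{\mathbf{x}}\cos(kz-\omega t) \pm \hat{\mathbf{y}}\sin(kz-\omega t)\right]
```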

 

It Still Needs Improvement

The electron easily loses the orientation of its spin when the temperature rises, and its spin ends up essentially random. This makes the key element of any future spin-light application highly ineffective: the information carried by the electron becomes too vague, so by the time it is converted into chiral light and reaches its processing destination, quantum information transfer at room temperature is inaccurate and insufficient.

The good thing about the project is that the researchers incorporated an efficient "spin-light interface" into the device. The new interface not only maintains the electron spin but also boosts the spin signal at ordinary room temperature, so the directional orientation is retained. From there, the information is encoded in the corresponding chiral light signals, which carry it in the designated direction.

 

Source: wikimedia.org

 

The proposed device enhances the spin signal thanks to tiny defects introduced into the material: according to the researchers, fewer than one in a million gallium atoms are displaced from their original lattice sites. These remaining impurities act as active spin filters that drain off electrons with undesirable spin orientation and preserve those with the desirable one. The nanopillar design, in turn, guides the emitted light so it can be transferred quickly and accurately.

 

The Energy-Efficient Neuromorphic Computing

 

Source: defense.gov

 

The invention of the computer has pushed humanity to new heights. It was once impossible to handle data gathering and intricate calculations at scale, but today a single computer can do it all.

The revolution in data processing and computing is still on the rise, with no sign of slowing down, because computer technology handles millions of entries every day. It brings people accuracy and convenience with little effort. However, most computers are not energy efficient. Thanks to new research from Binghamton University (a State University of New York campus), energy-efficient computers are now within reach.

 

The Developmental Start

A lot of devices depend on Wi-Fi for connectivity, and this poses several disadvantages for the end user. That is why Louis Piper (an associate professor of physics and director of materials science and engineering at Binghamton University) is trying to develop an energy-efficient computer design that keeps a device fully responsive. The focus is on a structure that keeps devices working even when they lose Wi-Fi connectivity to the primary computer system.

According to Professor Piper, there are two ways to address the problem of device connectivity and data processing: at the signal-connectivity level or at the hardware level (which is what his team is doing). The main idea of the project is to create chips capable of handling computing on their own rather than constantly communicating back and forth with a larger computer or server.

 

Source: flickr.com

 

The Complexity Of The Creation

Modern research is trying to replicate the functions of the neurons in the human brain, and through this continued effort scientists have created neuristors: circuits that behave like biological neurons and can perform complex computations using minimal power. Part of a neuristor is made of niobium dioxide, which acts much like the switching action of a neuron's ion channels.
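The article does not give the neuristor's equations, but the behavior it describes, a circuit that integrates input and "fires" like a biological neuron, can be sketched with a standard leaky integrate-and-fire model. The parameter values below are arbitrary illustrations, not measurements of a niobium dioxide device.

```python
# Minimal leaky integrate-and-fire neuron: a common stand-in for the kind of
# spiking behavior a neuristor is built to reproduce. All constants are
# illustrative, not properties of an actual NbO2 circuit.
import numpy as np

def simulate_lif(current, dt=1e-4, tau=0.02, v_rest=0.0, v_thresh=1.0, v_reset=0.0):
    """Integrate an input current over time and record spike times."""
    v = v_rest
    spikes = []
    for step, i_in in enumerate(current):
        # Membrane potential leaks toward rest while integrating the input.
        v += dt * (-(v - v_rest) / tau + i_in)
        if v >= v_thresh:          # threshold crossing = a spike
            spikes.append(step * dt)
            v = v_reset            # reset after firing, like a neuron's refractory reset
    return spikes

# Drive the neuron with a constant input for 100 ms.
input_current = np.full(1000, 60.0)
spike_times = simulate_lif(input_current)
print(f"{len(spike_times)} spikes, first at {spike_times[0] * 1000:.1f} ms")
```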

Though the study is progressing, these components are very difficult to fabricate: the process is complicated, elaborate, and time-consuming. The parts require a companion capacitor to function within the neuristor circuit, and the fabrication stage is especially challenging because forming the niobium dioxide has required a jolt of electricity.

That is not how engineers normally do fabrication, and it is not how silicon transistors are incorporated into manufacturing either. That is why, in the approach proposed by a Georgia Tech team for creating Nb2O5-x based devices, no additional pulse of energy is injected. Piper's team verified this method and said it could lead to an inexpensive, energy-efficient neuristor circuit.

The team aims to create the devices through traditional fabrication without the extra manufacturing step. However, current data suggests that producing highly efficient devices may require deviating from the original manufacturing technique. According to Binghamton, Piper's team is modeling the neuristor from a more atomic perspective: they do not want to compromise the nature of each material, and they want the device to retain the ability to evolve during operation.

Source: army.mil

 

Technology is getting smaller and smaller while increasing its effectiveness. Gone are the days of gigantic computers handling thousands or millions of data processes. The future of technological advancement points toward a single device that can compute, manage, retain, and process all of that information in a split second.

TMDCs – The Future Of Computer Processing

 

Source: flickr.com

 

The sudden surge of data to servers and computers reflects the emergence of smart devices, artificial intelligence (AI), and the Internet of Things (IoT). Since traditional computers can no longer efficiently process the millions of data points fed into their systems every day, science and technology have found another way. The discovery of TMDCs, or transition metal dichalcogenides, is helping in the development of computer systems capable of running and storing information up to a million times faster, more safely, and more conveniently. This is due to their optical properties, which let a computer function at its peak while managing energy consumption efficiently.

The Functioning Computer System

According to the Georgia State University research team, using transition metal dichalcogenides as the basis of computer construction would let a computer operate on the femtosecond scale (one quadrillionth of a second). Since traditional computers operate on the scale of a fraction of a nanosecond, a TMDC-based system would be roughly a million times faster, and the researchers expect the computer's memory capability to increase up to a million-fold as well.
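To see where the "million times" figure in this section comes from, compare the two time scales directly; the round numbers below are a back-of-the-envelope reading of the claim, not figures from the GSU paper.

```latex
% Conventional electronics switch on roughly the nanosecond scale,
% while the TMDC processes described here act on the femtosecond scale:
\frac{10^{-9}\ \text{s (nanosecond)}}{10^{-15}\ \text{s (femtosecond)}} = 10^{6}
\quad\Rightarrow\quad \text{about a million-fold speedup}
```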

A transition metal dichalcogenide has a hexagonal lattice (a classic 2D motif) made up of a plane of transition metal atoms sandwiched between two planes of chalcogen atoms. This physical structure is what lets a TMDC-based processor switch faster and store information efficiently.

Dr. Mark Stockman, director of the Center for Nano Optics and lead author of the research, stated that the only thing faster than this component is light itself. The physics and astronomy professor added that instead of focusing on computer electronics alone, integrating optics into the system should become the basis for an upgrade, because the electronics used in conventional computers have already reached their limits and cannot go any faster. The usual workaround, adding more processors, increases both the cost of operation and the cost of maintenance. That is why Stockman and his team are pushing the use of TMDCs to let computers work a million times faster, even if it means a different approach to the field of information technology.

 

Source: wikimedia.org

 

Under Development

For now, the TMDC concept is still theoretical and under development. The Georgia State researchers note that TMDCs' ability to process millions of data points may only hold for a few femtoseconds and could break down after reaching that peak. Though there is work left to do, this approach to a computer's computing mechanism already shows promising results.

Not only do TMDCs offer superior functionality compared with their electronic counterparts, they are also mechanically strong, stable, non-toxic, and light. Their optical properties allow them to operate at an ultrafast level. To picture how TMDCs function, imagine the hexagonal lattice of the material with electrons moving along circular paths in varying states; this motion produces a new effect called topological resonance, which is what allows TMDCs to operate on femtosecond time scales.

 

Source: maxpixel.net

 

With the discovery of TMDCs' superior qualities, the researchers from Georgia State envision a future where computers work efficiently at ultrafast rates without consuming a lot of power. The development of such high-quality electronic materials promises a better computer system, so the future of computing is heading toward optics technology.

Electronics can only do so much, and its headroom is limited, but optics offers plenty of upside for computing technology.

 

Brain-Inspired Computer Architecture For Data Handling

 

Source: pixabay.com

 

The amount of data processed today is nothing like the information fed to computers long ago. Today everything is digitally available, and as this flood of digital data hits the present computer system, hiccups in processing are inevitable: the current architecture cannot handle the sheer abundance of data that needs constant handling. That is why IBM researchers have been developing an upgraded computer architecture with a different approach, one able to sustain both previous and present designs.

Results show that the new computer architecture outperforms today's conventional computer systems. This AI-oriented computer architecture has found its way toward the mainstream market, and the system is efficient at handling, storing, and processing elaborate information from different sources. That is because the design of the new system mimics several concepts of the human brain, which is still the ultimate data processor to date.

 

Going Deep With Its Development

During the 1940s, a computer architecture called the "von Neumann" architecture was devised, and it has been the basis of every computer system since. Its structure includes the following essential components: a central processor that acts as the brain of the system and executes arithmetic and logic; a storage and memory unit that holds programs and data from external sources; and input/output (I/O) devices used to move data into and out of the system. These are the generic parts of a computer system.

 

Source: maxpixel.net

 

However, the researchers from IBM set out to create a much more efficient system, one they report can achieve up to 200 times the performance of its conventional counterpart. To get there, they focused on a brain-like computer in which processing and memory components coexist, basing the architecture on three specific levels of the human brain concept.

  • The first level makes use of the memory devices' state dynamics. As in the human brain, where everything from basic to complicated recollection takes place, the processing of the data happens in the memory itself; in human intellect, processing and memory occur in the same location (a minimal sketch of this in-memory computing idea follows after this list).
  • The second level draws on the brain's synaptic network structure, using the phase-change behavior of the memory components. This concept is used to accelerate the training of deep neural networks, allowing the computer to respond accurately and perform significant numerical integration.
  • Finally, the stochastic nature of the brain's neurons and synapses inspires the third level of the system. Through this concept, the researchers assembled a computational substrate well suited to boosting neural networks and giving the system digital-based critical thinking. It is characterized by analog storage that resembles biological synapses, storing a lot of data in a single nanoscale device.
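As a rough illustration of the first level above, in-memory computing is often explained through a memristive crossbar that performs a matrix-vector multiplication directly where the weights are stored. The sketch below is a purely numerical stand-in under that assumption; it is not IBM's design, and the conductance values are invented.

```python
# Conceptual sketch of in-memory computing with a memristive crossbar.
# Weights live in the device conductances; applying input voltages and
# reading the output currents performs the multiply-accumulate "in place",
# so no data shuttles between a separate CPU and memory. Values are illustrative.
import numpy as np

# Conductance matrix G (siemens): each cell is one analog memory device.
G = np.array([
    [0.8, 0.1, 0.3],
    [0.2, 0.9, 0.4],
])

# Input voltages applied to the crossbar columns.
v = np.array([0.5, 1.0, 0.2])

# Ohm's law + Kirchhoff's current law: each row current is a dot product,
# so the whole crossbar computes I = G @ v in a single analog step.
I = G @ v
print("Row output currents:", I)
```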

According to the IBM research team, the new computer design is bound to perform far better than the conventional computing system. The researchers made sure the new architecture is a good translation of how the human brain functions, so it can potentially manage, handle, store, and process millions of data points without hassle. Even at the conceptual stage, they predicted that the new architecture would exceed users' expectations.

 

Source: pexels.com

 

The development of this computer architecture is still underway, but the latest design is already far better than the conventional system: it can process the flood of information from AI systems without hiccups, giving it the power to work accurately and quickly.

 

Algorithm With The Ability To Find Antibiotic Candidates

 

Source: pixabay.com

 

Technology for making medicines has improved over the years; however, infectious diseases have also evolved into more formidable opponents as technology has advanced. That is why an international research team has applied a new idea to restructure pharmaceutical methods: in the search for a new pharmaceutical technique, the industry is focusing on the source compounds first, before creating a cure.

The pharmaceutical field is one of the wealthiest industries in the world, yet there are still many diseases that existing antibiotics cannot cure. Researchers from several prestigious universities (including Carnegie Mellon University, the University of California San Diego, and a university in St. Petersburg, Russia) came together with a proposal: a modern means of searching large repositories of compounds produced by microbes.

By analyzing mass spectra, the method can pinpoint the known compounds already present in a repository using information from previously characterized microbial compounds, letting researchers focus on the unknown ones that could yield more effective antibiotics or even anticancer medicines. The researchers named the algorithm Dereplicator; it is a way of efficiently finding promising antibiotic candidates among many different elements and compounds.
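The published algorithm is considerably more sophisticated, but the core dereplication step it performs, matching an observed mass spectrum against a database of known compounds so that only genuinely novel spectra get a closer look, can be sketched simply. The masses, tolerance, and compound names below are invented for illustration.

```python
# Toy dereplication: flag spectra that match a known compound's mass so that
# only unmatched (potentially novel) spectra are queued for further analysis.
# All masses and names are made up; the real Dereplicator scores full
# fragmentation patterns, not just a single precursor mass.

KNOWN_COMPOUNDS = {          # name -> monoisotopic mass (Da), illustrative
    "surfactin":   1035.69,
    "valinomycin": 1110.63,
    "actinomycin": 1254.63,
}
TOLERANCE = 0.02             # mass tolerance in daltons (assumed)

def dereplicate(observed_masses):
    """Split observed precursor masses into known matches and novel candidates."""
    known, novel = [], []
    for mass in observed_masses:
        match = next((name for name, ref in KNOWN_COMPOUNDS.items()
                      if abs(mass - ref) <= TOLERANCE), None)
        (known if match else novel).append((mass, match))
    return known, novel

known, novel = dereplicate([1035.70, 982.41, 1254.62, 1500.10])
print("Already known:", known)        # skip these
print("Worth investigating:", novel)  # candidates for new antibiotics
```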

 

Source: wikimedia.org

 

Testing The Algorithm

Before this work, efforts to mine these data repositories with mass spectrometry had largely stalled because of the complexity of the process, not to mention its high cost and the hit-or-miss tendency to keep rediscovering known compounds. The researchers tested the Dereplicator for a week on 100 computers, and the algorithm sorted through roughly a billion mass spectra in the Global Natural Products Social molecular network at the University of California. The analysis primarily measures the masses of ionized molecules within a sample, and it identified more than 5,000 unknown compounds that warrant a more thorough investigation.

After the breakthrough, the researchers made the algorithm available to any investigator who wants to study additional repositories, both to test its effectiveness and to gather opinions from across the pharmaceutical industry. According to some researchers, analyzing a compound's mass spectra is an inexpensive way of recognizing new types of drugs, although earlier methods were limited to peptides, which are composed of simple structures like loops and chains.

The unknown structures in the results were then examined to analyze a variety of more complex compounds. The group of researchers from the collaborating universities created a technique for predicting how a mass spectrometer would break molecules apart: the method starts with the weakest rings and simulates the molecules being put through the fragmentation process. AI technology was integrated into the algorithm, which was trained on 5,000 known compounds and their mass spectra, producing a computer model that can predict how other, unknown compounds would break down.

According to lead researcher Mohimani, the Dereplicator algorithm not only recognizes known compounds that need no further investigation; it can also identify "not so common" variants of several known compounds that would otherwise go undetected.

 

Source: pexels.com

 

With this new algorithm now available, new ventures in the field of pharmaceuticals open up. It promises more efficient progress in creating suitable, effective drugs for many diseases. And it's not just antibiotics: algorithms, and the future of AI in mental health crisis support, suggest that computers might be able to help us with a broad range of illnesses, not just medicine. From there, people and healthcare providers can reasonably conclude that the Dereplicator algorithm can become a gateway to a more efficient compound discovery and rediscovery process.