Tata Consultancy Services Says 3-D Printing Disruption in Manufacturing Will Soon Be Realized

Click here to read the entire article: Quality News Today.

U.S. Tech Firms to Take Hit From Huawei Sanctions

Agence France-Presse Top News

May 24, 2019

The tough sanctions imposed on Huawei by President Donald Trump could deal a blow to the many U.S. firms that make up the Chinese tech giant's supply chain.


American firms last year sold an estimated $11 billion worth of components to Huawei, which was put on a blacklist last week by Washington over national security concerns as trade frictions grow between the U.S. and China.


Trump's executive order could effectively ban makers of U.S. hardware and software from selling to Huawei by requiring a special license from Washington.


The Commerce Department on Monday delayed the sanctions on Huawei for 90 days, saying the additional time was needed to allow for software updates and other contractual obligations.


The agency said it was granting Huawei a temporary general license through August 19 allowing for transactions "necessary to maintain and support existing and currently fully operational networks and equipment," including software updates and patches, subject to legally binding contracts and agreements signed before May 16.


Hardware and software


Bloomberg News reported that U.S.-based chipmakers Intel, Qualcomm, Broadcom and Xilinx have indicated they would halt shipments to the Chinese firm, which is the world's number two smartphone maker and a leader in telecom infrastructure and super-fast 5G networks.


Google said it would comply with the U.S. order, leaving Huawei without access to critical services for the Android operating system such as Gmail and Google Maps.


Microsoft, which supplies the Windows operating system for many Huawei devices, did not respond to an AFP query on how the order might impact the Redmond, Washington-based firm.


Bob O'Donnell of the consultancy Technalysis Research said any ban would almost certainly affect Microsoft.


"If it affects Google, I don't see why it wouldn't affect Microsoft," O'Donnell said.


"Any version of Windows comes from Microsoft, since there is no open-source version."


Moving toward independence


Roger Kay, founder and analyst at Endpoint Technologies Associates, said the ban is likely to accelerate efforts by Huawei and other Chinese firms to develop their own sources of microprocessors and other components.


"The short-term effect on both American and Chinese companies is inevitably negative," Kay said.


"The longer-term effect is that Huawei and other Chinese companies turn away more sharply from American suppliers."


Neither Intel nor Qualcomm responded to queries on how they would respond to the order on Huawei.


Avi Greengart, founder of the research firm Techsponential, said a ban on sales to Huawei could hit a wide range of large and small U.S. firms including Corning, which makes the popular Gorilla Glass for smartphones, and Dolby, a producer of video and audio software for handsets.


"When you think about all the software and hardware components, you get a pretty big list," Greengart said.


"The U.S. is a big part of the global supply chain."


Few firms offered public comments on their response to the Huawei executive order.


But one, California-based Lumentum Holdings, which makes products for optical and laser applications, said it would comply with the executive order, and that Huawei accounted for 15% of its revenue so far in the current fiscal year.


Risks to Apple


Greengart said Apple could also suffer from any protracted crisis over Huawei, estimating the iPhone maker gets about 17% of its revenues from China.


"Even though Apple might benefit in the premium smartphone market in Europe, I think the risks are higher than the rewards for Apple," Greengart said.


"If there is a backlash against Apple in China, that could have damaging long-term effects."


Greengart said that Google might not see a major impact for the moment.


"Ironically, (the ban) won't affect Google much because Google doesn't make money selling Android."


Patrick Moorhead, of Moor Insights & Strategy, said he sees a limited impact on U.S. firms in the short run.


"The impact to the U.S. companies depends on the length of the ban, but also on how indexed they are in sales to Huawei," Moorhead said.


"Neither Intel, Google nor Nvidia does more than 3% of its business with Huawei, so short-term, it shouldn't be an issue."


O'Donnell said a bigger risk is that Huawei and other Chinese firms step up efforts to develop software and hardware that allows them to break free from Silicon Valley.


"The longer-term question is: does this drive Huawei to develop a third mobile platform?" O'Donnell said.


"China is already developing its own technology infrastructure, and this plays into the whole notion of a separate internet in China, which would be a big deal."


Copyright 2019 Agence France Presse All Rights Reserved.

Copyright © LexisNexis, a division of Reed Elsevier Inc. All rights reserved.  
Terms and Conditions   
Privacy Policy

Quality News Today is an ASQ member benefit offering quality related news
from around the world every business day.

Click here to read the entire article: Quality News Today.

A SpaceX Contractor Risked Disaster by Forging Safety Inspectors’ Signatures for Falcon 9 and Falcon Heavy Rocket Parts, U.S. Prosecutors Say

The Business Insider

May 24, 2019

By Alexandra Ma


A SpaceX contractor forged signatures on inspection reports for Falcon 9 and Falcon Heavy rocket parts, potentially endangering the missions, the U.S. Department of Justice (DOJ) alleged on Wednesday.


Officials arrested James Smalley, a quality assurance engineer, on Wednesday, and charged him with falsifying at least 38 inspection reports for SpaceX vehicle parts, the DOJ said in a press release.


The parts were flight-critical, and had been due to be used in Falcon 9 and Falcon Heavy launches, the DOJ said.


A SpaceX source told Business Insider that its internal systems identified the problem with the safety documentation, which eventually led to the DOJ prosecution.


Smalley worked for PMI Industries, which is based in Rochester, New York, and specializes in machinery for flight-critical aerospace parts. These were used to build vehicles for SpaceX and for other Department of Defense aerospace contractors, the DOJ said.


According to the Rochester Democrat and Chronicle newspaper, Smalley told investigators that he forged the signatures of at least three inspectors from SQA Services, a SpaceX quality assurance subcontractor, and illicitly used their quality stamps.


The aerospace industry uses audits to ensure the quality and safety standards of manufactured parts, in the hope of preventing faulty parts from causing accidents during flight.


Smalley is believed to have photocopied the inspectors' signatures, the DOJ said. He then copied and pasted the signatures and quality stamps onto the falsified reports using his company-issued laptop, according to the Democrat and Chronicle, which cited the criminal complaint against the engineer.


When one of his bosses asked Smalley why he allegedly falsified the reports, Smalley said it was because he wanted to ship more product for the company, the Democrat and Chronicle reported, citing the complaint.


SpaceX's business with PMI was worth around $200,000 a month, the DOJ said.


Smalley is due to appear before a judge at a courthouse in Rochester, part of the Western District of New York, at 1 p.m. on Thursday.


SpaceX had used parts purchased from PMI in seven NASA space flight missions, two U.S. Air Force space flight missions, and one National Oceanic and Atmospheric Administration space flight mission, the DOJ reported, citing a SpaceX records request.


It's not clear whether any of those missions used parts with falsified reports.


SpaceX ended its contract with PMI shortly after those flights, the DOJ said, and PMI shut down operations as a result.


SpaceX first became aware of the problems after SQA, in a January 2018 internal audit, found multiple falsified source inspection reports and non-destructive testing certifications.


SpaceX had at the time planned to use those parts for its Transiting Exoplanet Survey Satellite (TESS), a NASA mission that launched from a SpaceX Falcon 9 rocket in April 2018.


PMI Industries also shipped to SpaceX at least 76 individual piece parts that had either been rejected during source inspection or were never inspected by SQA, the DOJ reported.


It is not clear whether Smalley was involved in those shipments, or what SpaceX did with them.


Took the act of forgery to a new level


The DOJ's criminal case against Smalley is the result of a joint investigation by NASA, the U.S. Air Force Office of Special Investigations, and the FBI.


U.S. Attorney James Kennedy said in a statement: "The success of America's reinvigorated space program depends not just on American ingenuity but on American integrity as well."


"These charges make clear that those who commit fraud against NASA, the Air Force, and the National Oceanic and Atmospheric Administration, which are among the government agencies leading our space program, and those private companies, such as SpaceX, with whom the government partners and contracts in its effort, will be held accountable when they seek to cut corners," he said.


FBI Buffalo Special Agent-in-Charge Gary Loeffert added: "According to the criminal complaint, James Smalley took the act of forgery to a new level."


"A potentially catastrophic level with the potential to not only cost millions of dollars, but also jeopardize years of irreplaceable work."


Copyright 2019 Newstex LLC All Rights Reserved.


Honda Recalls 137,000 New SUVs After Three People Were Injured by Sudden Airbag Explosions in the U.S.

Click here to read the entire article: Quality News Today.

Pharma’s Desperate Struggle to Teach Old Data New Tricks

Forbes.com

May 23, 2019

By David Shaywitz


While pharma C-suite executives find themselves increasingly seduced by the promise of digital transformation, and especially by the idea of leveraging AI, the lived, on-the-ground reality within virtually all pharma R&D organizations couldn't be further removed.


Novartis CEO Vas Narasimhan candidly alluded to this in January when he reflected on his company's digital transformation journey and said, "The first thing we've learned is the importance of having outstanding data to actually base your ML on. In our own shop, we've been working on a few big projects, and we've had to spend most of the time just cleaning the data sets before you can even run the algorithm. That's taken us years just to clean the datasets. I think people underestimate how little clean data there is out there, and how hard it is to clean and link the data."


This is arguably the core problem underlying most datasets in health and in pharma, impeding not only the ability to leverage AI but even the ability to understand, at the most basic level, the data within your own organization. Essentially, any effort to glean novel insight from existing data represents a maddeningly difficult challenge.


The enormous magnitude of this challenge was highlighted this week in an outstanding Stat article by Casey Ross, which takes readers behind the scenes to learn what was required to run the AI-driven analysis of lung cancer diagnosis reported by Google and others. In short, Google's AI was used to assess chest CT scans to determine whether lung cancer was likely present, and its performance was compared with that of human radiologists; the algorithm seemed to do better than the radiologists when looking at scans representing a single point in time, and no worse when the radiologists were allowed to view previous scans from the same patient.


According to Ross's article, a huge challenge in doing this study was extracting the patient cases (from Northwestern's hospital system) used to test the algorithm. Ross quotes Mozziyar Etemadi, a biomedical engineer and anesthesiologist at the Chicago hospital system, who told him, "It was a pretty crazy engineering challenge. We had to write a ton of software just to communicate between different parts of the [hospital's] electronic records."


Ross goes on to astutely observe, "The struggle underscores one of the biggest barriers to the development and use of AI in medicine: Patient information is held in a crazy array of computer systems and formats that defy efforts to build coherent datasets. The data are needed to train algorithms and, as in the Northwestern example, validate their performance on patient cases they have never seen."


Behold: this is the dirty little secret bedeviling all digital transformation efforts in healthcare and in pharma. Moreover, I don't think anyone working on digital and data in healthcare found the Northwestern saga remotely surprising; rather, as Ross points out, it is painfully representative.


It's about the data


"Do I believe this?" Dr. Amy Abernethy, Principal Deputy Commissioner of the FDA, responded rhetorically when asked about the data curation challenges encountered by the team performing the Google AI study. "Absolutely."


Abernethy is arguably one of the world's experts on exactly this sort of data wrangling, having spent her entire professional career (as discussed in this Tech Tonics podcast) working on the challenges of leveraging clinical cancer data, both as an oncologist and clinical researcher at Duke and as a senior executive at the data company Flatiron (both before her present role).


As she explains, "At the end of the day, the challenges for the application of AI to healthcare aren't really about development of the algorithms; it's about the data. Sophisticated AI prediction algorithms have been developed in many industries, especially those where discrete, voluminous, well-organized and readily analyzable data are the norm (for example, meteorology and finance). For AI to be applied to healthcare, however, the underlying data need to be organized and readily accessible. That is not the case with most healthcare data."


Just how bad are the data challenges in clinical data? Pretty bad, Abernethy seems to suggest. While data quality challenges can be overcome, she says, they have to be characterized and addressed. In electronic health record (EHR) datasets, many of these complexities are magnified because data are inconsistently coded (or not coded at all), making the data difficult (if not impossible) to merge. Data quality problems abound, such as cut-and-paste errors and missing parts of the patient's longitudinal record. And many key patient outcomes are not characterized or coded.
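The merging problem Abernethy describes is easy to reproduce in miniature. The sketch below is a toy illustration (the patient IDs, diagnosis codes, and column names are all invented), using pandas to show how inconsistent coding defeats a naive join:

```python
import pandas as pd

# Two hypothetical EHR extracts describing the same patients, but one
# codes diagnoses as ICD-style codes while the other uses free text,
# and the patient-ID formats disagree (zero-padding lost).
labs = pd.DataFrame({
    "patient_id": ["001", "002", "003"],
    "diagnosis": ["C50", "C34", "C50"],
})
notes = pd.DataFrame({
    "patient_id": ["1", "2", "3"],
    "diagnosis": ["breast cancer", "lung cancer", "breast cancer"],
})

# A naive merge on the raw keys matches nothing at all.
naive = labs.merge(notes, on="patient_id", how="inner")
assert naive.empty

# Only after normalizing IDs and mapping codes to a shared vocabulary
# does the merge recover the overlap -- and this is the trivial case.
code_map = {"C50": "breast cancer", "C34": "lung cancer"}
labs["patient_id"] = labs["patient_id"].str.lstrip("0")
labs["diagnosis"] = labs["diagnosis"].map(code_map)
merged = labs.merge(notes, on=["patient_id", "diagnosis"], how="inner")
print(len(merged))  # 3
```

In real EHR data the mappings are rarely this clean, which is why the curation Abernethy describes consumes so much human effort.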


Turns out, this creates problems, particularly when you're trying to develop algorithms. Abernethy explains, "At least in the near-term, the smartest AI algorithms in healthcare are developed with labeled datasets, i.e., datasets where key features are consistently labeled in a reliable and codified fashion. For example, if you want to generate algorithms to predict that this patient has breast cancer, it is best to build the prediction algorithm using data that accurately labels the patient case as having breast cancer or not."


Abernethy's suggestion (presumably informed by her Flatiron experience): human curation, at least for now. "One recent example of the challenges of the middleware of data for AI was when it was disclosed that real humans were listening to Alexa recordings to support algorithm development. Should we have been surprised? As of now, it is hard to imagine reliable data from the real world where we don't need human curation and/or cross-check. That is what you are seeing here in the Google AI for lung cancer screening example. It turns out that even some of the most sophisticated tech companies in the world (Amazon, Google) have data problems. Data are at the core."


Data: Asset or liability?


As Eric Perakslis, a health data guru and Rubenstein Fellow at Duke, explains, data in both health systems and clinical trials (and, I'd add, other biopharma research) are generally collected for extremely specific purposes. In clinical encounters, the data is logged into an EHR and used for building a longitudinal history and tallying up procedures for billing. In clinical trials, you are filling in boxes in the physical implementation of a clinical protocol, a statistics database. Both, Perakslis says, are forms of data management where the goal is "efficiency, compliance and effective sausage making."


The challenge, he continues, is: "Learning anything secondary, also known as knowledge management, from either is usually an afterthought and requires data to be extracted, reformatted and re-homed into an additional structure such as a data warehouse/mart/lake. This requires additional labor and additional risk, as duplicate data amplifies cost as well as compliance, security and privacy risks. For these reasons, it is seldom done unless funded and prioritized via leadership. This can be messy and expensive to bean counters but, I'd argue, should be prioritized, because if done correctly, data should be the second most valuable asset, after talent, in any scientific organization."


Like Abernethy, Perakslis acknowledges the effort involved in actually arriving at usable data: "Given the archaic infrastructure of most large institutions, data curation, cleaning and transformation amounts to manual hand-to-hand combat between highly educated humans and text interfaces. It takes forever, costs a fortune and simply should be avoided. People should think things through from day one and know that it makes more sense to lay plumbing and conduit for the anticipated addition when you design the house, not after."


Adds Perakslis, randomized control trials cost anywhere from $30,000-$50,000 or more per patient. "For less than $1,000 more (per patient), you could create data files in a modern lake ready for AI, ML or human mining. Why aren't we all spending the extra two cents per sausage even if our primary job is sausage making? Treating data as an asset versus a liability should be the key." (I've also discussed the contrast between the positive optionality tech companies see in data vs. the negative optionality many biopharma companies perceive.)


Data > algorithm


Imran Haque, another savvy data scientist, agrees that data wrangling is generally the rate-limiting challenge for projects like Google's seeking to apply AI to health data: "90+% of effort in real-world machine learning projects will end up focused on mundane data cleaning and data management, not exciting models and algorithms work. To a first order, quality and scale of data are much more important than the particular type of ML algorithm used on the data: it can be surprisingly hard to beat simple logistic regression."


Haque notes that academic ML model/algorithm research largely focuses on a handful of manually curated benchmark datasets, partially to abstract away the huge amount of work it takes to collect a dataset and focus instead on the modeling problem, and partially to provide a means of comparing results across papers. Unfortunately, for those seeking to solve real-world problems like those facing hospitals and pharma R&D groups, data is not usually served in a nicely prepackaged form.
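Haque's point about strong simple baselines is easy to demonstrate. The sketch below uses synthetic data only (nothing from the Google study) and scikit-learn to fit a plain logistic regression; on clean, labeled data it already discriminates well, which is the bar fancier models must clear:

```python
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

# Synthetic stand-in for a clean, labeled clinical dataset:
# 2,000 "patients", 20 features, binary outcome.
X, y = make_classification(n_samples=2000, n_features=20,
                           n_informative=5, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0)

# The "boring" baseline: plain logistic regression.
baseline = LogisticRegression(max_iter=1000).fit(X_train, y_train)
auc = roc_auc_score(y_test, baseline.predict_proba(X_test)[:, 1])
print(f"logistic regression AUC: {auc:.3f}")
```

With well-curated inputs, the baseline lands well above chance; in practice most project effort goes into producing a table this clean, not into replacing the model.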


Generating their own data


Because of the data quality problem, some innovators seeking to leverage AI have decided to develop their own datasets. As Chris Gibson, CEO of Recursion Pharmaceuticals, a leading company in this space, explains, "To ask complex questions of biology and model it using the power of AI & ML, one must ideally generate data specifically for that purpose. Most data available publicly in large datasets, developed by biopharma companies or large health systems, is not designed to be used to ask the kinds of questions we and others need to ask. Data like this requires a Herculean effort to clean and prepare it for machine learning use, and is often collected in ways that introduce worrisome biases."


Hence, he decided to generate the necessary data on his own. "Because of these issues," he explains, "we've spent more than five years at Recursion building out the industry's largest biological images dataset, generated all in-house. Today we do nearly 350,000 carefully controlled experiments each week in a variety of human cell types, from which we generate, among other things, fluorescent microscopy images that have a tremendous depth of information about biological states within them."


Similarly, Daphne Koller, an AI wizard (listen to her Tech Tonics podcast) and CEO and founder of another leading company, insitro, notes, "Machine learning is only as good as the data you feed it. A key rate-limiting factor in the application of machine learning to biomedicine is the lack of high-quality data that's fit to purpose. At insitro, we are building a bio-data factory that leverages a range of cutting-edge technologies to produce high-quality data at enormous scale. This enables us to identify problems where having more predictive models would be transformative, and then generate data specifically to enable machine learning to be applied to those problems."


Bottom line: Data parasite redux?


While many healthcare systems and pharmaceutical companies seem to be embarking on digital transformations, the continued reality for both is that virtually all data are still collected in an outmoded fashion that lets the information fulfill only its original, very concretely defined purpose (billing and rudimentary patient-history documentation in the case of EHRs; collection of very specific data for pre-defined statistical analyses in the case of clinical trials). It is nearly, if not truly, prohibitively difficult to garner additional value from these data.


In the short term, some of these hurdles can be overcome through Herculean curation efforts involving an exceptional amount of manual labor. Going forward, it remains to be seen whether stakeholders (meaning hospital systems and biopharma companies) will evolve to a more modern model that would enable far greater downstream utility, or will keep on keeping on, pursuing the occasional curation project but largely avoiding profound changes to the underlying data philosophy and architecture.


Already, innovators seeking to leverage contemporary AI techniques have recognized the need to create their own, largely preclinical, data. It will be interesting to see if these approaches ultimately generate the profound insights anticipated by their founders, a result that could potentially motivate wider adoption.


I am particularly curious about whether any pharma will truly evolve to a contemporary approach to data collection and management, given that this represents a profound organizational challenge: the benefits accrue to future or adjacent stakeholders (who could benefit from new uses for the data), while the burdens and inconveniences are borne largely by those who most directly need the data for the originally intended use, today. (This can be regarded as just another manifestation of the "data parasite" tension of several years ago.)


Appealing as the vision might be, getting to this data science mindset, as I've written, seems like a really high hurdle for an organization to overcome. Changing this calculus will require several viscerally and financially compelling examples of successfully using data for secondary purposes; optimistic assertions of data's exceptional potential may motivate C-suite executives to call for change, but are unlikely to motivate those on the front lines to drive it.


Copyright 2019 Forbes LLC All Rights Reserved.


Automotive Industry Action Group Releases Contractor Safety Management Framework

Targeted News Service

May 23, 2019

In response to a call from industry for greater consistency and attention surrounding contractor safety, AIAG announces the latest launch in its health and safety initiative.


The Automotive Industry Action Group (AIAG) today announced the release of a new publication detailing a unified approach to contractor safety management in the supply chain.


Created at AIAG in collaboration with automakers, suppliers and contractors, the new Contractor Safety Management Framework outlines common processes for effective contractor safety management, specifically in construction environments, facilitating more consistent health and safety requirements for contractors among hiring clients and owners. It can also be used within any health and safety (H&S) management system (e.g., ISO 45001, OHSAS 18001, ANSI Z10).


"Our team worked with a diverse group of professionals from a range of AIAG member companies to gather information on health and safety programs, related training options and other industry insights," explains Lecedra Welch, AIAG program manager, environmental sustainability. "The resulting document allows users to build upon the framework with their own specific criteria, such as adapting the processes for global use or incorporating local regulatory and legislative requirements."


The framework is intended to support contractors and companies in identifying and reducing health and safety risks and hazards, and successful implementation of its processes will also help expand and improve collaboration between hiring clients/owners and contractors. As such, tasks such as pre-qualifying a contractor's safety program before issuing a contract and tracking the contractor's safety performance once the project begins are also addressed in the framework document.


Welch concludes: "Contractor safety is an essential element of any organization's overall health and safety program, and not only because contractors are typically engaged in high-risk, non-standard work. When we work collaboratively to support and champion safety efforts throughout the supply chain, we all benefit."


About AIAG: The Automotive Industry Action Group (AIAG) is a unique not-for-profit organization where OEMs, suppliers, service providers, government entities, and individuals in academia have worked collaboratively for more than 36 years to drive down costs and complexity from the automotive supply chain. AIAG membership includes leading global manufacturers, parts suppliers, and service providers. Visit www.aiag.org for more information.


Copyright 2019 Targeted News Service LLC All Rights Reserved.
