Search the Community

Showing results for tags 'ibm'.

  1. The Intel 8088 processor launched 44 years ago today, and helped to start the PC revolution by John Callaham

If you are working on a PC today, there's a good chance that it's using a chip that can trace its roots back to the Intel 8088 processor. That chip launched 44 years ago today, on June 1, 1979. However, its true impact on the PC industry would have to wait a while longer.

First, let's look at the hardware specs of the Intel 8088, via the company's own website:

  • Clock speed: 8 MHz, 4.77 MHz
  • Manufacturing process: 3-micron
  • Number of transistors: 29,000
  • Addressable memory: 64 KB
  • Bus speed: 8 MHz, 4.77 MHz

The Intel 8088 is actually a slightly different version of the Intel 8086, which launched a year earlier, in June 1978. Both chips had 16-bit registers. The main difference between the two CPUs is that while the 8086 had a 16-bit data bus, the 8088 had only an 8-bit data bus. This small difference would be the key to the wider use of the 8088 later.

In the late 1970s and early 1980s, the personal computer industry was just starting. Companies like Apple, Commodore, Tandy, and even video game console maker Atari were releasing their own PC models. IBM, known previously for its huge mainframe computers meant for large corporations, decided to get in on this new market and launch a PC of its own.

Instead of designing its first PC completely on its own, as it had with its previous computers, IBM contacted third parties to help make its first PC product. The reasoning was that IBM could quickly put together a PC and put it on the market faster than if it did everything in-house. IBM's site stated:

They went to Microsoft for the operating system (QDOS, renamed PC-DOS and later sold by Microsoft as MS-DOS) and to Intel for its 8088 processor. They chose an existing monitor from IBM Japan and a dot-matrix printer by Epson. Only the keyboard and the system unit itself were new designs from IBM.
So why did IBM choose the Intel 8088 processor to be in its first PC? There's actually some debate on this subject. Microsoft co-founder Bill Gates stated in a 1997 interview with PC Magazine that he and fellow co-founder Paul Allen actually pushed IBM to use a 16-bit processor. However, David Bradley, who helped to put together the first IBM PC for the company, tells a different story in an article he wrote for Byte in 1990. He offered four main reasons for picking a processor like the Intel 8088:

1. The 64K-byte address limit had to be overcome. This requirement meant that we had to use a 16-bit microprocessor.
2. The processor and its peripherals had to be available immediately. There was no time for new LSI chip development, and manufacturing lead times meant that quantities had to be available right away.
3. We couldn't afford a long learning period; we had to use technology we were familiar with. And we needed a rich set of support chips—we wanted a system with a DMA controller, an interrupt controller, timers, and parallel ports.
4. There had to be both an operating system and applications software available for the processor.

So why did IBM ultimately pick the 8088 over the 8086? Bradley said that the final choice was due to a familiar reason: it helped make the PC cheaper to produce:

We chose the 8088 because of its 8-bit data bus. The smaller bus saved money in the areas of RAM, ROM, and logic for the simple system.

The first IBM PC launched on August 12, 1981 with a price of $1,565. It quickly became a sales success and led not only to more IBM PC models, but also to PCs made by other companies that were clones of the IBM product. They all used versions of Intel's x86 chip line. Today, the 13th Gen Intel Core processors that the company is currently selling can trace their roots back to that original 8088 model.
The company is currently getting ready to launch its next chip architecture, Meteor Lake, and it's also in the very early stages of developing a 64-bit-only CPU. Intel tried to get away from the x86 architecture with its 64-bit server chip Itanium in the 2000s, but that chip failed to make a significant impact. However, even future Intel chips will owe a debt to the Intel 8088 that launched 44 years ago.
  2. IBM worker, off for 15 years, speaks up about his situation and discloses illness by Paul Hill A couple of days ago, Neowin reported on the IBM worker Ian Clifford who sued IBM to try and get a pay rise despite not working since 2008. Mr Clifford has now publicly commented on his situation to The Telegraph explaining that he had sued for a pay rise to help his family’s financial security as he has been diagnosed with leukaemia and is on chemotherapy. It was reported initially that he would receive his pay until he got to age 65, which is true, but Mr Clifford says that his “life is being curtailed” and that the chance of him living to 65 is “highly unlikely.” Aside from publicly sharing his illness, he also said that the pay rise he wanted from IBM was just 2.5 per cent, much lower than the above 10 per cent inflation that has been seen in the UK. According to The Telegraph, Mr Clifford has now lodged an appeal against the ruling to get another shot at the pay rise. Explaining his situation, Mr Clifford said: “I am on chemotherapy and have been for many years and have been extremely unwell. Your salary affects your death in service [insurance], pension and everything else, it was more for my family. People may think, yes it's generous, but firstly those amounts are gross not taxed. ... I do pay National Insurance on those amounts. I have a son [who is] off to university. Your mortgage doesn't go down because you are sick. I had to use all my savings to bring this case and more and had to borrow money on a credit card… it's left me financially very vulnerable. People will still think it's greedy but at the end of the day, yes it's unfortunate, but that was a benefit I got with the job. My life is being curtailed, the chances of me living to 65 is highly unlikely.” If you missed the initial reporting, essentially Ian Clifford was actively working for IBM until 2008, at which point, he fell ill and took advantage of an IBM health plan that was available to him. 
On that plan, he would receive 75% of his £72,037 ($89,671) salary until he reaches age 65. Unfortunately, he did not receive pay increases like the active workers at IBM, and with the rate of inflation running high recently, he tried suing the company for a pay rise. It has also been disclosed that Mr Clifford and his lawyers made two separate offers to IBM before taking the case to court. The court took the side of IBM on the grounds that the lack of a pay rise was not disability discrimination, because he was being treated more favourably than other employees, who still have to go to work for their pay. The case has provoked mixed reactions: some people think his claim is legitimate, while others think his existing pay is satisfactory, especially seeing as it's well above the UK average wage and he's not actively working. Source: The Telegraph
  3. Inactive IBM employee who hasn't worked since 2008 sued company after getting no pay rise by Paul Hill An IBM employee, Ian Clifford, who has been on sick leave since 2008 on an IBM health plan, has recently tried to sue his employer for not giving him a pay rise. With inflation being so high recently, many people have noticed their spending power declining, including Mr Clifford. As part of his health plan, Mr Clifford is entitled to 75% of his salary, which is £72,037 ($89,671). After the 25% deduction, he receives £54,028 ($67,254). As part of the plan, he is still counted as an IBM employee but is under no obligation to do any work. Mr Clifford’s complaint was made on the basis that he has not received a salary increase since 2013 and with inflation so high, his current income “would soon wither”. According to Mr Clifford, the point of the health plan is to offer security to those who can’t work but with the payments frozen at their current levels, he doesn’t believe the plan offers the promised security. For comparison, the median average salary for those working full-time in the UK is £33,000 ($41,078) per year and the median for all employees is £27,756 ($34,550). In response to his case, Judge Paul Housego decided to dismiss the matter. He said the payment was only available to those who were disabled and could not, therefore, be considered less favourable treatment related to disability. In fact, he said, it’s more favourable treatment, not less. According to The Telegraph, which first reported on the story, Mr Clifford now considers himself “medically retired” on his LinkedIn page. The case was brought by Mr Clifford in February of this year. He is set to continue receiving payments until his 65th birthday, by which time, he'll have collected more than £1.5 million ($1.86 million) in total. Source: The Telegraph (Yahoo! Finance)
  4. IBM says it will pause hiring for jobs to be replaced by AI automation by Karthik Mudaliar IBM says that it expects to pause hiring for around 7,800 jobs that could be replaced by Artificial Intelligence (AI) in the coming years. The reduction would also include not backfilling roles vacated through attrition. Company CEO Arvind Krishna, in an interview with Bloomberg, said that the hiring freeze would affect back-office functions such as human resources the most. Krishna said, "I could easily see 30% of that getting replaced by AI and automation over a five-year period". The use of AI to automate tasks like customer service, writing, and coding has worried the general workforce. Krishna believes that routine tasks such as producing employee verification letters or transferring staff between departments will be completely automated, while some HR functions, like assessing workforce composition and productivity, will remain unaffected for the next ten years. IBM, earlier this year, had disclosed plans to lay off approximately 3,900 employees. Despite that, IBM continues hiring for software development and customer-facing roles. Krishna said that IBM added around 7,000 people in the first quarter of this year. Source: Bloomberg
  5. The IBM PC-XT launched 40 years ago today but it got competition from the Compaq Portable by John Callaham IBM PC-XT In 1981, IBM launched its first desktop personal computer. However, the IBM PC had some limitations, such as the lack of a hard drive and too few expansion ports. 40 years ago today, on March 8, 1983, the company came out with its next-generation desktop computer, the IBM PC-XT. One of the two big hardware improvements in the IBM PC-XT was an increase in expansion slots, from five in the original IBM PC to eight in the new PC. This allowed owners to add more hardware, such as an additional floppy disk drive or another hard drive. Speaking of which, the second big hardware improvement was the addition of a Seagate 10 MB hard drive. This allowed the IBM PC-XT to boot the PC-DOS 2.0 operating system from the hard drive, rather than relying on a floppy disk. The PC did come with a 5.25-inch floppy disk drive as well. Later models were sold without the included hard drive. Unfortunately, the company decided that the IBM PC-XT would get the Intel 8088 microprocessor, the same processor that shipped with the original IBM PC a year and a half earlier. The IBM PC-XT was originally sold with 128KB of memory, but later versions had 256KB and finally 640KB of memory on the motherboard. You had to purchase the monitor for this PC as an accessory or have it bundled with the computer. You can find out a lot more about the PC's hardware specs on the DOS Days website. The original IBM PC had a starting price of $1,565 when it launched in 1981, according to PC Mag. By contrast, the price for the first model of the IBM PC-XT was a whopping $7,545, thanks to that included hard drive. That's part of the reason why the company sold later versions without the hard drive, to make it more affordable. One model with just a single floppy drive and a bundled monochrome monitor sold for £1,736 in the UK, or about $2,058.
By the time the IBM PC-XT was released, other companies were coming out with their own personal computers that were IBM PC-compatible. One of them, released around the same time as the IBM PC-XT, was the Compaq Portable. It was the first PC in what turned out to be a long line of Compaq personal computers. Compaq Portable It had a built-in 9-inch green phosphor CRT, but the original lacked the built-in hard drive that the IBM PC-XT included. Notably, while the Compaq Portable's BIOS was written from scratch, it was made to run any software that the IBM BIOS could. It even came with its own carrying case (hence the "Portable" in the name). When it launched, the Compaq Portable cost over $3,000, but according to ZDNet, the PC still sold about 53,000 units in its first year. More importantly, it started the "IBM PC compatible" era, as more personal computers from other companies soon joined in to compete directly with IBM. It was the true start of the PC industry that continues to this day. What was your first PC, and when did you acquire it? Let us know in the comments below.
  6. IBM beats revenue expectations, to let go of 3,900 employees by Justin Luna During IBM's latest conference call reporting its financial performance for the fourth quarter of 2022, the company announced that its operating profits and revenue met analysts' expectations. The company also reported solid growth in its cloud, artificial intelligence, and data analysis software business. However, it also said that it will let go of 3,900 workers. This number is equivalent to about 1.5% of IBM's global workforce. According to the company, the job cuts are a result of earlier asset sales rather than weakness in its business. In 2021, IBM spun off its managed infrastructure services business Kyndryl into a separate company. More recently, the firm sold off its Watson Health analytics business to a private equity firm. IBM will spend about $300 million in the first quarter of this year to pay for the severance packages of those who will be let go. Despite this development, the company still expects to hire in "higher-growth" areas. IBM joins the wave of technology companies that have laid off employees. In November of last year, Twitter significantly shed its workforce after it was acquired by Tesla CEO Elon Musk. A few days later, Meta announced the layoff of over 11,000 employees. A few months after that, Microsoft and Amazon also announced job cuts as part of broader cost-cutting measures. Finally, Google and Spotify also bade goodbye to a significant portion of their staff. Source: Bloomberg (paywall)
  7. IBM India: moonlighting allowed with permission, but only for good causes by Karthik Mudaliar IBM India's managing director Sandip Patel wrote an email to employees indicating that they can indulge in their passions, as long as it is not at the expense of IBM's interests. "The moonlighting concept can cause a lot of confusion if not clarified at a granular level which is why I am writing you," he wrote in his email addressed to IBM India employees. However, his email does not provide a granular clarification. Instead, it further muddies the waters by telling employees that the company values their passions, but that they should keep in mind whether those passions conflict with IBM's interests. At IBM, our stance has always been clear: we encourage every IBMer to bring their whole selves to work. Your passion – be it for art, dance or music – is celebrated here, and in that spirit, we'd love to see you pursue your interest. However, if you advance a personal interest, directly or indirectly, at the expense of IBM's interests, it is treated as a serious conflict of interest and a violation of trust. The term moonlighting refers to an employee using their spare time to work somewhere else. "If it's gray, stay away or ask for clarification", Patel further added. Moonlighting has spurred a lot of controversy in India. Last month, Wipro fired 300 employees for taking on extra work after the company's chairman tweeted that moonlighting is cheating. There is a lot of chatter about people moonlighting in the tech industry. This is cheating - plain and simple. — Rishad Premji (@RishadPremji) August 20, 2022 Tata Consultancy Services' chief operating officer N Ganapathy Subramaniam also labeled moonlighting an ethical issue that damages company culture, offering short-term gain for long-term pain. Source: The Register
  8. The Next Tech Revolution: What Your Business Needs To Know To Stay Ahead - Free Webcast by Steven Parker Download your complimentary on-demand webcast today. With modern, hybrid environments the new norm, companies are equipping themselves with platform agnostic, field-friendly, scalable cloud environments to work faster and smarter than ever before. However, as companies adopt emergent technologies and environments become more complex, disparate, and inefficient systems prevent companies from seeing true ROI from their investments. In this panel discussion, Fast Company and IBM will explore the latest developments within hybrid cloud computing, data use, insights, and how to integrate these technologies on your own terms to become a true catalyst for innovation. Featured speakers include: Dr. Talia Gershon (IBM), TBD, and Greg Lindsay (FastCo Works moderator) How to get it Please ensure you read the terms and conditions to claim this offer. Complete and verifiable information is required in order to receive this free offer, or download with LinkedIn. If you have previously made use of these free offers, you will not need to re-register. While supplies last! The Next Tech Revolution: What Your Business Needs To Know To Stay Ahead - Free Webcast Offered by IBM | Limited time offer Disclosure: A valid email address is required to fulfill your request. Complete and verifiable information is required in order to receive this offer. By submitting a request, your information is subject to TradePub.com's Privacy Policy.
  9. Canonical sees 240% YoY growth in its channel partner program bolstering business products by Paul Hill Canonical, the company that distributes the popular Linux distribution Ubuntu, has announced that its channel partner program, which began four years ago, has seen 240% growth year-over-year (YoY). Through the program, Canonical's partners are able to resell and distribute products that Canonical develops. Canonical said that the growth was driven primarily through distribution partners in Europe and North America – it grew existing partnerships and brought more on board. The last year saw Canonical work more closely with major OEMs including Dell and IBM so that they could build embedded products and data centre-oriented enterprise solutions. Commenting on the news, Regis Paquette, Vice President of Global Alliances and Channels at Canonical, said: "I am extremely honored to be included again in the Channel Chiefs list. Our partners are vital to Canonical's success and I am proud of the work we've done together in serving our global customers. This recognition is a testament to our program's success, to Canonical's ability to become the conduit of open source for our partners and customers, and I look forward to its continued growth and momentum." Going forward, Canonical said it'll also be focusing on areas such as telecommunications, automotive, and finance. In the auto space, it wants to get its solutions into next-gen vehicles to provide high-performance computing capabilities. It said that its proven track record of delivering security, stability, and 24/7 reliability in data centres makes it an ideal choice for the automotive sector. If you'd like to learn more about the channel partner program, head over to Canonical's website to get more details.
  10. Google, Microsoft, and more to advise UK government on international data transfers by Usama Jawad A few days ago, Google penned a blog post emphasizing the importance of a robust EU-U.S. data transfer framework, and while major conversations surrounding this topic haven't officially kicked off yet, today big tech firms are meeting with the UK government to advise on international data transfers. In its first meeting, the International Data Transfer Expert Council will advise the UK government on how it can unlock the "benefits of free and secure cross-border data flows", considering that the country has left the European Union (EU). The government has noted that the full potential of its trading capabilities has not been utilized because of hurdles related to data transfer. UK Data Minister Julia Lopez went on to say that: Realising the benefits of international data flows has never been more important. We want the UK to drive forward cutting-edge policies at home and overseas to ensure people, businesses and economies benefit from safe and secure data flows. Today we're launching a new panel of global experts to help us achieve these aims and I will lead the first meeting so together we can deliver a world-leading and truly global data policy for the future. Today will be the first of many meetings held by the International Data Transfer Expert Council, which will meet on a quarterly basis. It includes representatives from many notable tech companies and institutes including Microsoft, Google, Mastercard, IBM, the World Economic Forum, and more. Although the UK already has several data protection laws, the council will focus on new laws for the international transfer of identifiable and non-identifiable data. The effort includes ensuring that the trading country to which the data is being transferred also has data laws similar to those of the UK.
Prioritized countries and areas for this endeavor include the U.S., Australia, the Republic of Korea, Singapore, the Dubai International Finance Centre, and Colombia. The idea will be to make the UK "a global leader in removing barriers to cross-border data flows" under the government's National Data Strategy.
  11. LG joins the IBM Quantum Network to improve its quantum computing competency by Paul Hill LG has announced that it has joined the IBM Quantum Network to help improve its own competency in the field of quantum computing and utilise new technologies as they are unlocked. Through the partnership, LG will gain access to IBM's quantum computers, expertise, and an open-source quantum information software development kit called Qiskit. Commenting on the partnership, Kim Byoung-hoon, CTO and executive vice president of LG Electronics, said: "Based on our open innovation strategy, we plan to use IBM Quantum to develop our competency in quantum computing. We aim to provide customers with value that they have not experienced so far by leveraging quantum computing technology in future businesses." LG will use its new position in the IBM Quantum Network to explore how the new technology can be leveraged to process large amounts of information for the delivery of services such as artificial intelligence, connected cars, digital transformation, Internet of Things, and robotics applications. LG will also train its workforce in quantum computing so that they can help deliver the improvements quantum computing promises. More broadly, the IBM Quantum team and LG will be researching the use of quantum computing in other fields such as finance, energy, chemistry, materials science, optimisation, and machine learning. As breakthroughs are made in quantum computing by the IBM Quantum Network, LG will be right there ready to reap the benefits.
  12. IBM's ModelMesh goes open-source, enabling developers to deploy AI models at scale by Usama Jawad Model serving is a critical component of AI use-cases. It involves offering an inference from an AI model in response to a user request. Those who have dabbled in enterprise-grade machine learning applications know that it is usually not one model providing an inference, but actually hundreds or even thousands of models running in tandem. This is a very expensive process computationally as you can't spin up a dedicated container each time you want to serve a request. This is a challenge for developers deploying a large number of models across Kubernetes clusters because there are limitations such as the maximum number of pods and IP addresses allowed as well as compute resource allocation. IBM solved this challenge with its proprietary ModelMesh model-serving management layer for Watson products such as Watson Assistant, Watson Natural Language Understanding, and Watson Discovery. Since these models have been running in production environments for several years, ModelMesh has been thoroughly tested for various scenarios. Now, IBM is contributing this management layer to open-source complete with controller components as well as model-serving runtimes. ModelMesh enables developers to deploy AI models on top of Kubernetes at "extreme scale". It features cache management and also acts as a router that balances inferencing requests. Models are intelligently placed in pods and are resilient to temporary outages. ModelMesh deployments can be upgraded with ease without any external orchestration mechanism. It automatically ensures that a model has been fully updated and loaded before routing new requests to it. Explaining the scalability of ModelMesh with some statistics, IBM went on to say that: One ModelMesh instance deployed on a single worker node 8vCPU x 64G cluster was able to pack 20K simple-string models. 
On top of the density test, we also load test the ModelMesh serving by sending thousands of concurrent inference requests to simulate a high traffic holiday season scenario. Our experiment showed that the single worker node supports 20k models for up to 1,000 queries per second and responds to inference requests with single-digit millisecond latency. IBM has contributed ModelMesh to the KServe GitHub organization, which was developed jointly by itself, Google, Bloomberg, NVIDIA, and Seldon back in 2019. You can check out the ModelMesh implementation contributions in the various GitHub repositories mentioned below:

  • Model serving controller
  • ModelMesh – containers used for orchestrating model placement and routing
  • Runtime Adapters:
    • modelmesh-runtime-adapter – the containers which run in each model-serving pod and act as an intermediary between ModelMesh and third-party model-server containers. It also incorporates the "puller" logic which is responsible for retrieving the models from storage
    • triton-inference-server – Nvidia's Triton Inference Server
    • seldon-mlserver – Python MLServer, which is part of KFServing
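The cache-management-and-routing idea described above can be illustrated with a toy sketch. This is hypothetical code for intuition only, not IBM's actual API: a fixed-capacity serving pod keeps an LRU cache of loaded models, loads a model lazily on first request, and evicts the least recently used model when the cache is full.

```python
from collections import OrderedDict

class ToyModelMesh:
    """Toy illustration of ModelMesh-style serving (not IBM's real code):
    a pod with limited capacity serves many models by keeping only the
    most recently used ones loaded."""

    def __init__(self, capacity, loader):
        self.capacity = capacity
        self.loader = loader        # expensive: pulls a model from storage
        self.cache = OrderedDict()  # model name -> loaded model (LRU order)
        self.loads = 0              # how many expensive loads happened

    def infer(self, model_name, request):
        if model_name in self.cache:
            self.cache.move_to_end(model_name)      # mark as recently used
        else:
            if len(self.cache) >= self.capacity:
                self.cache.popitem(last=False)      # evict least recently used
            self.cache[model_name] = self.loader(model_name)
            self.loads += 1
        return self.cache[model_name](request)

# A "model" here is just a function; real ModelMesh serves AI models.
mesh = ToyModelMesh(capacity=2,
                    loader=lambda name: (lambda x: f"{name}:{x}"))
print(mesh.infer("sentiment", "hello"))  # loads "sentiment"
print(mesh.infer("intent", "hello"))     # loads "intent"
print(mesh.infer("sentiment", "again"))  # cache hit, no reload
print(mesh.loads)                        # only 2 loads for 3 requests
```

The real system adds what this toy omits: intelligent placement across many pods, request routing, resilience to outages, and rolling upgrades.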
  13. U.S. President Joe Biden plans to meet with Microsoft, Apple, and Amazon CEOs tomorrow by Usama Jawad The U.S. government collaborating with big tech firms isn't exactly something new. Former U.S. President Donald Trump appointed CEOs from multiple companies such as Microsoft, Tesla, Apple, Uber, Alphabet, IBM, and more as his strategic advisors. Many of these companies, including Microsoft, departed from this advisory body following strained relations. Now, it appears that sitting U.S. President Joe Biden wants to mend bridges with big tech firms and has invited numerous CEOs to the White House. Bloomberg reports that president Biden will be meeting with representatives from tech firms tomorrow. In terms of names we are aware of so far, Tim Cook from Apple, Satya Nadella from Microsoft, and Andy Jassy from Amazon have been invited to the gathering. All three of these are CEOs at their respective firms. Apart from that, Google, IBM, and JPMorgan Chase have also been invited. That said, it's unclear how many of them will actually attend the meeting. While the exact agenda of this meeting is currently unclear, it's highly likely that cybersecurity and growing digital threats will make up the meat of the discussion. In the past few months or so, we have seen massive cyberattacks on SolarWinds, Colonial Pipeline, Kaseya, Microsoft Exchange Server, and more. For its part, Microsoft welcomed the inauguration of Joe Biden as the President of the United States back in November, and stated that bridges need to be built to unite people. In May, the president also signed an executive order urging private companies to collaborate with the government to strengthen the nation's cybersecurity defenses. Microsoft is already leading this initiative with its Zero Trust security models. Source: Bloomberg
  14. IBM's new Telum chips will help infer enterprise workload, detect fraud, and more by Karthik Mudaliar IBM has unveiled its new 7nm Telum processor at the Hot Chips 33 conference. The processor is designed to bring deep learning inference to enterprise workloads, which can help detect fraud in real time. After three years of development, IBM has finally managed to create this breakthrough technology. The first system to feature the chip is expected to come out in the first half of 2022. The chip is designed to help customers achieve business insights at scale across banking, finance, trading, insurance, and customer interactions. With the new processor, applications can run efficiently where the data resides. IBM says that this approach is far better than traditional enterprise AI approaches, which tend to require significant memory and data movement capabilities to handle inferencing. A collection of IBM Telum 7nm processors on a silicon wafer Since the new processor is in close proximity to mission-critical data and applications, it will be easier for enterprises to conduct high volumes of inferencing for real-time sensitive transactions, without having to invoke platform AI solutions that might impact performance in the long run. IBM stated: "The chip contains 8 processor cores with a deep super-scalar out-of-order instruction pipeline, running with more than 5GHz clock frequency, optimised for the demands of heterogeneous enterprise-class workloads." IBM further added that it has completely redesigned the cache and chip-interconnection infrastructure, which provides 32MB of cache per core and can scale to 32 Telum chips, and that the new dual-chip module design contains 22 billion transistors and 19 miles of wire on 17 metal layers. While IBM has developed the processor, it was actually built on the 7nm extreme ultraviolet technology by Samsung and is the first chip made by the IBM Research AI Hardware Center.
The research center has also recently announced scaling to a 2nm node, a benchmark in silicon and semiconductor innovation.
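IBM hasn't said which models Telum will run; as a rough illustration of the kind of in-place fraud-detection inference described above, here is a toy logistic-regression scorer. The weights and features are invented for the example and are not IBM's.

```python
import math

# Illustrative fraud score via a tiny logistic model -- the kind of
# inference Telum is built to run next to the transaction data.
# Weights and features are made up for the example.

def fraud_score(features, weights, bias):
    """Probability that a transaction is fraudulent, via a logistic model."""
    z = bias + sum(w * f for w, f in zip(weights, features))
    return 1.0 / (1.0 + math.exp(-z))

# features: [amount z-score, is_foreign, recency score]
weights = [1.4, 0.9, 2.1]
bias = -3.0
suspicious = fraud_score([2.5, 1.0, 0.8], weights, bias)
routine = fraud_score([0.1, 0.0, 0.05], weights, bias)
print(suspicious > 0.5, routine > 0.5)
```

The point of running this next to the data, per IBM's pitch, is that every transaction can be scored synchronously without shipping it to a separate AI platform.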
  15. New research shows that near-term quantum computers can learn to reason by Ather Fawaz The applications and development of quantum computers have steadily picked up pace in the last few years. We've seen researchers applying this novel method of computation in a variety of domains including quantum chemistry, fluid dynamics research, open problems, and even machine learning, all with promising results. Continuing this trend, UK-based startup Cambridge Quantum Computing (CQC) has now demonstrated that quantum computers "can learn to reason". Though confusing at first, the claim is based upon new research coming out of CQC. Dr. Mattia Fiorentini, Head of Quantum Machine Learning at the firm, and his team of researchers investigated using quantum computers for Variational Inference. Variational Inference is a process through which we approximate a given probability distribution using stochastic optimization and other learning techniques. Jargon aside, this means a quantum computer outputs potential solutions to inferential questions such as: given that the grass is wet and the sky is cloudy, which is the more probable cause, rain or a sprinkler? Formally, the question is posed as follows: “What is the state of the unobserved variables given observed data?” These outputs can then be used in downstream tasks such as finding the likeliest reason given the available data or predicting future outcomes and their confidence. The team's work, titled Variational inference with a quantum computer, has been published on the pre-print repository arXiv and highlights what the firm believes to be a promising indicator that quantum computers are great at Variational Inference and, by extension, at reasoning. Outputs from a quantum computer appear random. However, we can program quantum computers to output random sequences with certain patterns. These patterns are discrete and can become so complex that classical computers cannot compute them in reasonable time. 
This is why quantum computers are natural tools for probabilistic machine learning tasks such as reasoning under uncertainty. In the paper, the researchers demonstrate their results on Bayesian Networks. Three different problem sets were tested. First was the classic cloud-sprinkler-rain problem described above. Second was the prediction of market regime switches (bull or bear) in a Hidden Markov Model of simulated financial time series. Third was the task of inferring likely diseases in patients given some information about symptoms and risk factors. Using adversarial training and the kernelized Stein discrepancy, the details of both of which can be found in the paper, the firm optimized a classical probabilistic classifier and a probabilistic quantum model, called a Born machine, in tandem.
Adversarial method
Kernelized Stein discrepancy method
Once trained, inference was carried out on the three problems defined earlier, both on a quantum simulator and on IBM Q's real quantum computers. In the truncated histograms shown below, the magenta bars represent the true probability distribution, blue bars indicate outputs from a quantum computing simulator, and grey bars indicate the output from real quantum hardware from IBM Q. The results on real quantum computer hardware are marred by noise, which causes slower convergence compared to the simulation; that is to be expected in the NISQ era, however.
Truncated histogram of the posterior distribution for a hidden Markov model
Histogram of the posterior distribution for a medical diagnosis task
The probability distribution of the quantum simulator closely resembles the true probability distribution, indicating that the quantum algorithm has trained well, and that the firm's adversarial training and kernelized Stein discrepancy methods are powerful algorithms for the intended purpose. 
The authors write: "We demonstrate the approach numerically using examples of Bayesian networks, and implement an experiment on an IBM quantum computer. Our techniques enable efficient variational inference with distributions beyond those that are efficiently representable on a classical computer." The firm believes that this is yet another indicator that "sampling from complex distributions is the most promising way towards a quantum advantage with today’s noisy quantum devices", and that its new inference methods "can incorporate domain expertise". Moving forward, the firm envisions "a combination with other machine learning tasks in generative modeling or natural language processing for a meaningful impact." Further details can be found in this blog post and the paper on arXiv. If you are interested, you can also check out Dr. Fiorentini's interview on YouTube here.
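For intuition, the sprinkler-versus-rain question above can be answered exactly on a classical machine by enumerating the network's joint distribution. The conditional probability tables below are illustrative, not taken from the paper; quantum variational inference targets the regime where this kind of enumeration becomes intractable.

```python
from itertools import product

# Exact inference for the cloudy/sprinkler/rain network by brute-force
# enumeration. CPT numbers are illustrative only.
p_cloudy = 0.5
p_sprinkler = {True: 0.1, False: 0.5}              # P(sprinkler | cloudy)
p_rain = {True: 0.8, False: 0.2}                   # P(rain | cloudy)
p_wet = {(True, True): 0.99, (True, False): 0.9,
         (False, True): 0.9, (False, False): 0.0}  # P(wet | sprinkler, rain)

def joint(c, s, r, w):
    """Joint probability of one full assignment of the four variables."""
    pc = p_cloudy if c else 1 - p_cloudy
    ps = p_sprinkler[c] if s else 1 - p_sprinkler[c]
    pr = p_rain[c] if r else 1 - p_rain[c]
    pw = p_wet[(s, r)] if w else 1 - p_wet[(s, r)]
    return pc * ps * pr * pw

# Posterior given the evidence "grass is wet AND it is cloudy".
evidence = sum(joint(True, s, r, True) for s, r in product([True, False], repeat=2))
p_rain_post = sum(joint(True, s, True, True) for s in [True, False]) / evidence
p_sprinkler_post = sum(joint(True, True, r, True) for r in [True, False]) / evidence
print(round(p_rain_post, 3), round(p_sprinkler_post, 3))
```

With these tables, rain comes out as by far the likelier cause; the cost of enumeration, however, grows exponentially with the number of variables, which is the opening the paper's quantum approach aims at.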
  16. IBM study: Gender equity isn't a top priority for business by Paul Hill A new study from IBM has found that 70% of global businesses do not regard gender equity in the workplace as a top priority, despite efforts to encourage women into the workplace, especially in STEM fields. Huawei is one such firm that has been trying to make software development more inclusive for women, but IBM’s data shows that despite a boost in such programs, mindsets and cultures have not changed enough, which is hurting outcomes. IBM refers to businesses that prioritise gender equity as ‘First Movers’ in its report and, according to the findings, these businesses reported stronger financial performance, were more innovative, and recorded stronger customer and employee satisfaction. Despite these positive results, the report also found that, among the women surveyed, fewer held senior vice president, vice president, director, or manager roles in 2021 compared to 2019. Commenting on the findings, Senior Vice President of Global Markets at IBM and Senior Executive Sponsor of the IBM Women’s Community Bridget van Kralingen said: “The data show that many women leaders are experiencing challenges at this moment. If these issues are not addressed more deeply than in prior years, there is a risk of progress backsliding further. We should seize creative solutions now and redouble our efforts to make meaningful, lasting change that can help all women reach their full potential.” Those surveyed also seemed less optimistic about programmatic efforts to address gender equity: 62% of women, down from 71%, and 60% of men, down from 67%, believe that their organization will significantly improve the gender equity situation over the next five years. IBM's solutions To address the issue, IBM recommended several things that businesses can try to encourage more women into lasting jobs. 
It said that firms should make gender equity a top-five issue and offer pathways for women to re-enter the workforce; for example, if a woman leaves to have a child and decides to get a job a couple of years later, she should be offered training, access to tools and technology, mentorship, and work assignments matched to her expertise. Another solution the firm suggested was that businesses should try to offer benefits such as back-up childcare support and mental health resources. IBM said that the best-performing CEOs ensure the well-being of their workforce even if it comes at the cost of profitability or budget. Finally, IBM said that to accompany programs that get women into jobs, there also needs to be a culture shift that encourages supportive team cultures and is flexible enough to meet people’s personal and professional needs. Before employees are even hired, IBM said that businesses should adopt AI and other technologies that can reduce bias during the candidate screening process, boost investment in collaborative tools, and enable employees to work on-site and remotely even after the pandemic ends. Data for this study was gathered from more than 2,600 executives, middle managers, and professionals from 10 industries across nine geographic regions. With the study being about gender equity, IBM made sure that the data it collected was sourced from an equal number of men and women.
  17. Amazon says 20 more firms join The Climate Pledge by Paul Hill Amazon and Global Optimism have announced that 20 more firms have become signatories to The Climate Pledge, an initiative that calls on signatories to reach net-zero carbon emissions by 2040. The most famous names among the companies joining are IBM and Iceland Foods (a UK-based supermarket chain), and they join the likes of Microsoft, Uber and Best Buy. The full list of companies that have joined consists of ACCIONA, Colis Prive, Cranswick plc, Daabon, FREE NOW, Generation Investment Management, Green Britain Group, Hotelbeds, IBM, Iceland Foods, Interface, Johnson Controls, MiiR, Ørsted, Prosegur Cash, Prosegur Compañia de Seguridad, Slalom, S4Capital, UPM, and Vanderlande. As signatories, they agree to measure and report their greenhouse gas emissions at regular intervals, introduce decarbonisation strategies to meet their net-zero goal, and pay into various emission offsetting schemes to neutralise any remaining emissions. Commenting on today’s news, Amazon founder and CEO Jeff Bezos said: “As the U.S. takes an important step forward in the fight against climate change by officially rejoining the Paris Agreement this week, I am excited to welcome 20 new companies to The Climate Pledge who want to go even faster. Amazon co-founded The Climate Pledge in 2019 to encourage companies to reach the goals of the Paris Agreement 10 years early, and we’re seeing incredible momentum behind the pledge with 53 companies from 18 industries across 12 countries already joining. Together, we can use our collective scale to help decarbonize the economy and preserve Earth for future generations.” Amazon said that each of the companies is at a different stage in its net-zero journey but that all are committed to adhering to the Paris Agreement ten years earlier than is required of them.
  18. IBM's new roadmap for quantum computing promises 100x speedups and then some by Ather Fawaz One of the pioneers of quantum computing, IBM, revealed its Quantum Development Roadmap for the future of quantum computers today. It builds on the firm's previous roadmap from September 2020, in which it laid out the pathway towards achieving quantum computing ecosystems comprised of thousands of noise-resilient and stable qubits by 2023. This "inflection point", as IBM puts it, is crucial for the full-scale, commercial realization of quantum computers. Since then, the firm has made significant inroads towards achieving this goal, which has been highlighted in the update unveiled today. Firstly, this year, IBM is planning on releasing Qiskit runtime—an execution environment that speeds up the execution of quantum circuits by as much as 100x. Qiskit runtime achieves this substantial speedup by reducing the latency in the communication between classical and quantum computers. By cutting this latency, workloads that take months to run today can be cut down to a matter of a few hours. The Qiskit runtime rethinks the classical-quantum workload so that programs will be uploaded and executed on classical hardware located beside quantum hardware, slashing latencies emerging from communication between the user’s computer and the quantum processor. One of the primary use cases of quantum computers is the simulation of quantum systems, which is an arduous task for classical computers since the computational complexity required to model a system grows exponentially with respect to its size. Today, a simulation of Lithium hydride (LiH) can take up to 100 days. But with the 100X speedup, this task can be done in one day. 
Moreover, Qiskit runtime will increase the capacity to run a greater variety of quantum circuits, allowing developers to run programs developed by others as a service in their own workloads and eventually tackle previously inaccessible problems with quantum computers. With help from the firm's OpenQASM3 assembly language and technologies designed on OpenShift, by 2023 IBM plans on debuting circuit libraries and advanced control systems for manipulating large qubit fabrics. Cumulatively, IBM boldly claims that come 2023, its quantum systems will be powerful enough to explore major problems with a clear, demonstrable advantage over classical computers. Come 2025, IBM is confident that it will achieve "frictionless quantum computing", a turning point at which the barrier to entry into quantum development will be greatly lowered. By then, we envision that developers across all levels of the quantum computing stack will rely upon our advanced hardware with a cloud-based API, working seamlessly with high performance computing resources to push the limits of computation overall—and include quantum computation as a natural component of their existing computation pipelines. And a decade from now, in the 2030s, IBM hopes that its hardware and software prowess will reach the extent that users will be able to run billions and trillions of quantum circuits without even realizing they are doing so. That would be the era of practical, full-scale commercial quantum computers.
  19. IBM to reportedly cut 10,000 jobs from its IT services unit in Europe by Abhay Venkatesh IBM is reportedly planning to cut 10,000 jobs in the European region, affecting employees in the company’s legacy IT services business. The decision was supposedly announced in a meeting with European labor representatives earlier this month. The change will reportedly affect close to 20% of employees in the region, with staff in the U.K. and Germany set to be impacted the most. Bloomberg reports that the layoffs are meant to help cut costs in the services unit that the company is expected to spin off. The firm is supposedly planning to focus on its “hybrid-cloud computing and artificial intelligence unit” that will help bring revenue growth. The report adds that the company has already been laying off employees this year but has not disclosed any numbers. In an email statement to Bloomberg, a spokesperson for IBM said that the firm’s “staffing decisions are made to provide the best support to [its] customers in adopting an open hybrid cloud platform and AI capabilities”. CFO James Kavanaugh said in an earnings call in October that the technology giant is “taking structural actions to simplify and streamline” its business, likely hinting at the potential changes. These changes will reportedly be the first big move by IBM CEO Arvind Krishna, who took over the post from Ginni Rometty in April. A person familiar with the matter told the publication that the layoffs should be completed by the first half of 2021. Source: Bloomberg
  20. IBM, Red Hat and others want inclusive language in software by Paul Hill IBM, Red Hat and VMware are among several companies that have come together to create the Inclusive Naming Initiative, which aims to eliminate problematic language from projects and replace it with an agreed set of neutral terms. To do this, the initiative will define processes and tools to remove harmful language from projects. Some of the processes and tools the Inclusive Naming Initiative will be creating include a comprehensive list of terms with replacements, language evaluation frameworks and templates, and infrastructure to aid the transition. Explaining the need for more inclusive words, the initiative says: “If software is truly meant to be inclusive and a place where anyone can participate, it must be welcoming to all. If words or phrases convey secondary unintended meanings to our audience (or are simply confusing!) we are potentially limiting participation in our projects, which is antithetical to this goal.” Initially, attention will be aimed at replacing the terms ‘master’, ‘slave’, ‘whitelist’ and ‘blacklist’ because these are the most visible and problematic across the industry. Over time, it will expand its scope to find replacements for other terms that reference mental health, gender, physical handicaps, and several other categories. In the future, it might also give tips on avoiding colloquialisms that don’t translate into other languages very well or are a barrier to understanding. While some people may be against changing these terms, the Inclusive Naming Initiative argues that the neutral terms are more descriptive; for example, it says that ‘denylist’ is more precise and more accurate than ‘blacklist’. Source: Inclusive Naming Initiative via Phoronix
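As a sketch of the kind of tooling the initiative describes, a scanner that flags the four initial terms and suggests replacements could look like this. The 'denylist'/'allowlist' suggestions follow the initiative's reasoning above; the replacements shown for 'master' and 'slave' are common community choices, not an official list.

```python
import re

# Minimal term scanner in the spirit of the Inclusive Naming Initiative's
# proposed tooling. Replacement choices are illustrative.
REPLACEMENTS = {
    "whitelist": "allowlist",
    "blacklist": "denylist",
    "master": "main",      # common choice, e.g. for default branch names
    "slave": "replica",    # common choice in database terminology
}

def flag_terms(text):
    """Return (term, suggestion) pairs for each flagged word found in text."""
    found = []
    for term, suggestion in REPLACEMENTS.items():
        if re.search(rf"\b{term}\b", text, re.IGNORECASE):
            found.append((term, suggestion))
    return found

print(flag_terms("Add the host to the blacklist, then push to master."))
```

A real implementation would also need the language evaluation frameworks the initiative mentions, since some occurrences (e.g. 'master' in "master's degree") are not in scope.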
  21. IBM and AMD join forces to improve security on their virtual machines powering AI workloads by Ather Fawaz The past couple of days have seen some important announcements in the domain of artificial intelligence (AI). Sony entered the field of AI-powered drones with the launch of Airpeak, and Silicon Valley-based startup Mipsology joined hands with Japanese software development firm OKI IDS to further AI-based image processing in Japan using Field-Programmable Gate Arrays (FPGAs). Now, there's another addition to this list. Today, IBM and AMD announced a multi-year joint development partnership under which the two firms will work together to extend and improve their security and AI offerings. At the heart of the agreement is the aim to further Confidential Computing, a technology that encrypts data on virtual machines—while workloads are running—using hardware solutions. This provides an additional layer of security, inhibiting malicious actors from stealing data in the event of a break-in. The goal of the partnership is to ease hybrid cloud adoption for highly regulated businesses or organizations that are concerned about unauthorized access to data in use in the public cloud. The two firms aim to realize this idea with open-source software, standards, and system architectures in hybrid cloud environments, including AI and high-performance computing (HPC) workloads and others that use virtualization and encryption. "The commitment of AMD to technological innovation aligns with our mission to develop and accelerate the adoption of the hybrid cloud to help connect, secure and power our digital world", said Dario Gil, Director of IBM Research. Meanwhile, the Executive Vice President and CTO of AMD, Mark Papermaster, said that "this agreement between AMD and IBM aligns well with our long-standing commitment to collaborating with leaders in the industry. 
AMD is excited to extend our work with IBM on AI, accelerating data center workloads, and improving security across the cloud."
  22. Microsoft partners with Nvidia, IBM, and more to release Adversarial ML Threat Matrix by Usama Jawad Microsoft is observing National Cyber Security Awareness Month (NCSAM) currently, and cybersecurity seems to be at the forefront for the firm. Over the past few weeks, the company has announced new initiatives to promote cybersecurity awareness, Zero Trust Deployment Center, and an offensive against the malicious Trickbot botnet. Now, it has released the Adversarial ML Threat Matrix framework in collaboration with various organizations such as IBM, Nvidia, MITRE, and more. Microsoft says that many security analysts believe that attacks against machine learning (ML) systems should be a concern for the future rather than right now, even though the Redmond tech giant's data suggest that this is not the case. Cyberattacks against commercial ML programs are becoming increasingly common because firms do not have the right tooling in place to protect these systems. To combat this growing threat, Microsoft has collaborated with MITRE and 11 other companies such as Nvidia, Bosch, IBM, and more to develop an open framework that organizes techniques that are used by malicious actors. Microsoft has clearly stated that the framework is aimed at security analysts and is similar in structure to the ATT&CK framework that the intended audience is already familiar with. Furthermore, it has also been seeded with known vulnerabilities that Microsoft and MITRE have noticed in real-world systems. The firm says that since this topic is popular in academic research, it is opening the industry-focused framework to the wider community. It is gaining insights from researchers at various universities, as well as its own tooling. Microsoft believes that its efforts will allow everyone to develop and deploy ML solutions securely. You can head over to the Adversarial ML Threat Matrix GitHub repository here to find out more about the initiative.
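To make the threat concrete, here is a minimal example of the sort of evasion attack such a framework catalogs: nudging an input against the weights of a linear classifier, in the spirit of the fast gradient sign method, flips its decision. The model and all numbers are made up for illustration; real attacks target far larger models, but the mechanism is the same.

```python
# Illustrative evasion attack on a toy linear classifier (FGSM-style step).
# Weights, bias, and input values are invented for the example.

def sign(v):
    """Sign of v as -1, 0, or 1."""
    return (v > 0) - (v < 0)

def predict(w, b, x):
    """Binary decision of a linear classifier: score > 0."""
    return sum(wi * xi for wi, xi in zip(w, x)) + b > 0

w, b = [0.8, -0.4, 0.3], -0.2
x = [0.5, 0.1, 0.2]  # benign input, classified positive

# Perturb each feature a small step against the weight's sign,
# which maximally lowers the score for a fixed perturbation budget.
eps = 0.3
x_adv = [xi - eps * sign(wi) for xi, wi in zip(x, w)]
print(predict(w, b, x), predict(w, b, x_adv))
```

The Adversarial ML Threat Matrix organizes this kind of technique alongside the reconnaissance and access steps an attacker needs to mount it against a deployed system.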
  23. QCE20: Here's what you can expect from Intel's new quantum computing research this week by Ather Fawaz The IEEE Quantum Week (QCE20) is a conference where academics, newcomers, and enthusiasts alike come together to discuss new developments and challenges in the field of quantum computing and engineering. Due to COVID-19 restrictions, this year's conference will be held virtually, starting today and running till October 16. Throughout the course of the event, QCE20 will host parallel tracks of workshops, tutorials, keynotes, and networking sessions by industry front-runners like Intel, Microsoft, IBM, and Zapata. From the pack, today we’ll peek into what Intel has in store for the IEEE Quantum Week. Particularly, we’ll be previewing Intel’s array of new papers on developing commercial-grade quantum systems. Image via Intel Designing high-fidelity multi-qubit gates using deep reinforcement learning Starting off, Intel will be presenting a paper in which researchers have employed a deep learning framework to simulate and design high-fidelity multi-qubit gates for quantum dot qubit systems. This research is interesting because quantum dot silicon qubits can potentially improve the scalability of quantum computers due to their small size. This paper also indicates that machine learning is a powerful technique in optimizing the design and implementation of quantum gates. A similar insight was used by another team at the University of Melbourne back in March in which the researchers used machine learning to pinpoint the spatial locations of phosphorus atoms in a silicon lattice to design better quantum chips and subsequently reduce errors in computations. Efficient quantum circuits for accurate state preparation of smooth, differentiable functions Next up, Intel's second paper proposes an algorithm that optimizes the loading of certain classes of functions, e.g. 
Gaussian and probability distributions, which are frequently used for mapping real-world problems to quantum computers. By loading data faster into a quantum computer and increasing throughput, the researchers believe that we can save time and leverage the exponential compute power offered by quantum computers in practical applications.
Image via Intel
On connectivity-dependent resource requirements for digital quantum simulation of d-level particles
One of the earliest and most useful applications of quantum computers is to simulate a quantum system of particles. Consider the scenario where the ground state of a particle is to be calculated to study a certain chemical process. Traditionally, this task involves obtaining the lowest eigenvalue of a matrix known as the Hamiltonian, which represents the energies of the system's states. But this deceptively simple task grows exponentially harder for larger systems that have innumerable particles. Naturally, researchers have devised quantum algorithms for it. Intel’s paper highlights the development and research requirements of running such algorithms on small qubit systems. The firm believes that the insight garnered from these findings can have potential implications for designing qubit chips in the future while simultaneously making quantum computing more accessible.
A BIKE accelerator for post-quantum cryptography
While we’re still in the NISQ (Noisy Intermediate-Scale Quantum) era of quantum computers, meaning that perfect quantum computers with thousands of qubits running Shor’s algorithm are still a thing of the future, firms have already started preparing for a ‘quantum-safe’ future. One of the foreseeable threats posed by quantum computers is the ease with which they can factor large numbers, and hence threaten to break our existing standards of encryption. In this paper, researchers at Intel have aimed to address this concern. 
The researchers present a design for a hardware accelerator for BIKE (Bit-flipping Key Encapsulation), a key-encapsulation scheme intended to make cryptosystems resilient to quantum attacks. Another thing to note here is that this approach is also currently under consideration by the National Institute of Standards and Technology (NIST), so a degree of adoption and standardization might be on the cards in the future.
Engineering the cost function of a variational quantum algorithm for implementation on near-term devices
Addressing the prevalent issues of the NISQ era once again, this paper debuts a novel technique that helps quantum-classical hybrid algorithms run efficiently on small qubit systems. This technique can be handy in this era since most practical uses of quantum computers involve a hybrid setup in which a quantum computer is paired with a classical computer. To illustrate, the aforementioned problem of finding the ground state of a quantum particle can be solved by a Variational Quantum Eigensolver (VQE), which uses both classical and quantum algorithms to estimate the lowest eigenvalue of a Hamiltonian. Running such hybrid algorithms efficiently is difficult, but the new method of engineering cost functions outlined in this paper could allow small qubit systems to do so.
Image via Intel
Finally, on the penultimate day of the conference, Dr. Anne Matsuura, the Director of Quantum Applications and Architecture at Intel Labs, will be delivering a keynote titled “Quantum Computing: A Scalable, Systems Approach”. In it, Dr. Matsuura will underscore Intel’s strategy of taking a systems-oriented, workload-driven view of quantum computing to commercialize quantum computers in the NISQ era: “Quantum computing is steadily transitioning from the physics lab into the domain of engineering as we prepare to focus on useful, nearer-term applications for this disruptive technology. 
Quantum research within Intel Labs is making solid advances in every layer of the quantum computing stack – from spin qubit hardware and cryo-CMOS technologies for qubit control to software and algorithms research that will put us on the path to a scalable quantum architecture for useful commercial applications. Taking this systems-level approach to quantum is critical in order to achieve quantum practicality.” The research works outlined above accentuate Intel’s efforts to develop useful applications that are ready to run on near-term, smaller qubit quantum machines. They also place the tech giant alongside IBM and Zapata, which are likewise working on the commercialization of quantum computers.
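The ground-state computation that the simulation paper concerns can be made concrete in miniature: for a 2x2 real symmetric Hamiltonian, the lowest eigenvalue has a closed form, and that lowest eigenvalue is exactly the quantity a VQE estimates variationally once the matrix becomes exponentially large. The matrix entries below are illustrative, not from Intel's paper.

```python
import math

# Ground-state energy of a 2x2 real symmetric Hamiltonian [[a, c], [c, d]]:
# eigenvalues are (a + d)/2 +/- sqrt(((a - d)/2)^2 + c^2), and the
# ground-state energy is the lower of the two.
def ground_state_energy(a, c, d):
    mean = (a + d) / 2
    gap = math.sqrt(((a - d) / 2) ** 2 + c ** 2)
    return mean - gap

# H = [[1, 0.5], [0.5, -1]] -> eigenvalues are +/- sqrt(1.25)
print(round(ground_state_energy(1.0, 0.5, -1.0), 4))
```

For n qubits the Hamiltonian is a 2^n x 2^n matrix, which is why direct diagonalization gives way to quantum algorithms as systems grow.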
  24. Orquestra now in full release, supports quantum hardware from Honeywell, IBM, and Amazon by Ather Fawaz Back in April, Zapata, a quantum computing startup debuted Orquestra, its end-to-end, unified software platform for designing and running quantum circuits. Previously available in Early Access only, Zapata announced Orquestra's full commercial release today, making the software more accessible to enterprise, government, and academic teams. Image via Honeywell The full release brings with it some key developments. First, you will now have access to Honeywell's 6-qubit quantum computer, the System Model HØ, making it the first platform to provide what they call "value-added access to Honeywell’s system". This will allow you to run quantum workflows directly on the firm's quantum computer using Orquestra. Vis-à-vis the initiative, the President of Honeywell Quantum Solutions, Tony Uttley said: “Through our value-added partnership with Zapata, users now have the unique opportunity to run quantum workflows directly on the Honeywell system. This allows them to get familiar with our system’s all-to-all connectivity and differentiated capabilities." Orquestra now also integrates with quantum hardware from IBM and Amazon Braket, allowing you to run your algorithms across a range of quantum hardware. The inclusion of Amazon Braket extends your reach to trapped ion processors from IonQ, and superconducting quantum processors from Rigetti as well. The original cohort of features in Orquestra will be integrated into the full release too. You will have access to modules written in libraries like Cirq, Qiskit, PennyLane and PyQuil. Optimized open-source (VQE, QAOA) and proprietary (VQF) algorithms will also be available. Usage and feedback from the Early Access phase have fueled "major improvements to Orquestra’s features, integrations and interactions,” said Christopher Savoie, CEO and co-founder of Zapata. Further details can be found here.
  25. IBM says its latest POWER10 CPUs are up to 20x faster than last generation by Sayan Sen IBM has unveiled its latest generation of POWER CPUs dubbed, unsurprisingly, the POWER10. Like previous-gen POWER9, which launched nearly three years ago, the new CPUs are also designed to improve AI processing and are to be deployed as hybrid cloud computing solutions. For example, according to IBM's own benchmark numbers, AI inferencing performance for 8-bit integer operations can be up to a whopping 20x faster than POWER9 solutions. Not only that, single-precision FP32 performance and half-precision BFloat16 performance also see enormous gains of 10x and 15x, respectively. IBM says that the new POWER10 chips are especially optimized to work with the Red Hat OpenShift platform. With POWER10, IBM is moving down to 7nm and will be utilizing Samsung's 7nm EUV node. With this, alongside architectural improvements in the mix, IBM says that its new POWER10 processors are up to three times more energy-efficient than last gen's. A new memory sharing technology called Memory Inception is being introduced by the company which will enable several POWER10 CPUs in a cluster to utilize the same memory pool, potentially increasing processing efficiency, and in turn, saving power. In terms of security, POWER10 features hardware-accelerated AES encryption, as well as newer security protocols like quantum-safe cryptography and fully homomorphic encryption. For added security, container isolation is also present that will prevent breaches from overflowing into other non-affected containers in the same VM space. Availability wise, enterprises and other customers should be able to get their hands on POWER10 systems in the second half of 2021.
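IBM hasn't detailed its INT8 pipeline, but the idea behind the 8-bit integer inferencing mentioned above can be sketched: quantize weights and activations to int8, accumulate in cheap integer arithmetic, and dequantize the result. The scales and values below are invented for the example.

```python
# Illustrative symmetric int8 quantization of a dot product --
# a sketch of the INT8 inference idea, not IBM's implementation.

def quantize(xs, scale):
    """Map floats to the int8 range [-127, 127] with the given scale."""
    return [max(-127, min(127, round(x / scale))) for x in xs]

def int8_dot(w, x, w_scale, x_scale):
    """Dot product computed in integer arithmetic, dequantized at the end."""
    wq = quantize(w, w_scale)
    xq = quantize(x, x_scale)
    acc = sum(a * b for a, b in zip(wq, xq))  # integer accumulate
    return acc * w_scale * x_scale            # dequantize

w = [0.5, -1.2, 0.3]
x = [1.0, 0.25, -2.0]
exact = sum(a * b for a, b in zip(w, x))
# scale = max absolute value / 127, so the largest entry maps to +/-127
approx = int8_dot(w, x, w_scale=1.2 / 127, x_scale=2.0 / 127)
print(round(exact, 3), round(approx, 3))
```

The int8 result lands within quantization error of the float result while using only narrow integer multiplies, which is the trade hardware exploits for those large inferencing speedups.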