Sunday, 18 February 2018

Microsoft: We're Developing Blockchain ID System Starting With Our Authenticator App

Microsoft has revealed its plans to use blockchain distributed-ledger technologies to securely store and manage digital identities, starting with an experiment using the Microsoft Authenticator app. From a report: Microsoft reckons the technology holds promise as a superior alternative to people granting consent to dozens of apps and services and having their identity data spread across multiple providers. It highlights that with the existing model people don't have control over their identity data and are left exposed to data breaches and identity theft. Instead, people could store, control and access their identity in an encrypted digital hub, Microsoft explained. To achieve this goal, Microsoft has for the past year been incubating ideas for using blockchain and other distributed ledger technologies to create new types of decentralized digital identities.

Read more of this story at Slashdot.



Read the full article here by Slashdot

Photograph of Single Atom Captured with a Plain Old Camera

The Engineering and Physical Sciences Research Council awarded a remarkable photograph its overall prize in science photography. The subject of the photograph? A single atom visible to the naked eye. Well, perhaps not exactly the naked eye, but without a microscope. In the picture above, the atom is that pale blue dot between the two needle-like structures.

You probably learned in school that you can't see a single atom, and that's usually true. But [David Nadlinger] from the University of Oxford trapped a positively charged strontium atom in an ion trap and then irradiated it with a blue-violet laser. The atom absorbs and re-emits the light, which a camera can pick up, creating a one-of-a-kind photograph. The camera was a Canon 5D Mk II with a 50mm f/1.8 lens — a nice camera, but nothing too exotic.

The ion trap keeps the single atom balanced between two small needle points about 2 millimeters apart. [Nadlinger] did some math that convinced him the photograph was possible and made it a reality on a Sunday afternoon. The pale dot isn't especially spectacular by itself, but when you realize it is the visual effect of a single atom, it is mind-blowing. It turns out the lab has taken similar photographs in the past. They don't remember who took it, but they have a picture of 9 trapped calcium-43 ions, which you can see below. The ions are 10 microns apart and at an effective temperature of 0.001 kelvin.

Other winning photographs included patterns on a soap bubble, an EEG headset in use, and microbubbles used to deliver drugs. There’s also an underwater robot, a machine for molecular beam epitaxy that looks like a James Bond villain’s torture device, and lattices made with selective laser melting 3D printing.

If you want to look at atoms from the comfort of your own home, maybe you should build an STM. You might even try NIST’s improved atom probe while you are at it. Just remember you can’t trust atoms. They make up everything.

Photo credit: David Nadlinger



Read the full article here by Hack a Day

Saturday, 17 February 2018

Google Exposes How Malicious Sites Can Exploit Microsoft Edge

Google's Project Zero team has published details of an unfixed bypass for an important exploit-mitigation technique in Edge. From a report: The mitigation, Arbitrary Code Guard (ACG), arrived in the Windows 10 Creators Update to help thwart web attacks that attempt to load malicious code into memory. The defense ensures that only properly signed code can be mapped into memory. However, as Microsoft explains, Just-in-Time (JIT) compilers used in modern web browsers create a problem for ACG. JIT compilers transform JavaScript into native code, some of which is unsigned and runs in a content process. To ensure JIT compilers work with ACG enabled, Microsoft put Edge's JIT compiling in a separate process that runs in its own isolated sandbox. Microsoft said this move was "a non-trivial engineering task." "The JIT process is responsible for compiling JavaScript to native code and mapping it into the requesting content process. In this way, the content process itself is never allowed to directly map or modify its own JIT code pages," Microsoft says. Google's Project Zero found that the way the JIT process writes executable data into the content process creates an issue.

Read more of this story at Slashdot.



Read the full article here by Slashdot

Logstash 6.2.0 Release Improves Open Source Data Processing Pipeline


Feb 16, 2018, by Sean Michael Kerner

Many modern enterprises have adopted the ELK (Elasticsearch, Logstash, Kibana) stack to collect, process, search and visualize data.

At the core of the ELK stack is the open-source Logstash project, which defines itself as a server-side data processing pipeline: basically, it helps to collect logs and then send them to a user's "stash" for searching, which in many cases is Elasticsearch.
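As a rough illustration (not taken from the article, and with hypothetical file paths), a minimal Logstash pipeline configuration that tails application logs, parses them, and ships them to a local Elasticsearch instance looks something like this:

```
input {
  file {
    path => "/var/log/app/*.log"
    start_position => "beginning"
  }
}

filter {
  grok {
    match => { "message" => "%{COMBINEDAPACHELOG}" }
  }
}

output {
  elasticsearch {
    hosts => ["localhost:9200"]
  }
}
```

The input, filter, and output sections correspond to the collect, process, and "stash" stages described above.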




Read the full article here by Linux Today

Why Red Hat Invested $250M in CoreOS to Advance Kubernetes


Feb 16, 2018, by Sean Michael Kerner

For the last three years or so, Red Hat has been on a collision course with CoreOS, with both firms aiming to grow their respective Kubernetes platforms. On Jan. 30 the competition between the two firms ended, with CoreOS agreeing to be acquired by Red Hat in a $250 million deal.

CoreOS didn't start out as a Kubernetes platform vendor, but then again neither did Red Hat.




Read the full article here by Linux Today

Friday, 16 February 2018

Just a Few Ads


Every Sunday, get not only the week's 5 comics, but also our selection of videos, gifs and articles that made us laugh and should make you laugh too!

What does it look like?

Like what you see? Subscribe :)

A coder's word of honor: your email will only be used for the newsletter!


Read the full article here by CommitStrip

Saturday, 10 February 2018

Valve Has Hired Another Open-Source Linux GPU Driver Developer

Valve has onboarded another open-source Linux graphics driver developer.

Joining the work by Timothy Arceri, Andres Rodriguez, Samuel Pitoiset, and others working on the open-source Linux graphics stack while being funded by Valve, Daniel Schürmann is the company's latest hire.

Daniel Schürmann is a relatively new name in Linux graphics contributions. He began his Valve work by contributing some RADV patches for the Radeon Vulkan driver in Mesa.

The German Linux developer has contributed to the Mixxx DJ mixing software, Cinnamon, and other open-source projects.

It turns out there are two Daniel Schürmanns in Linux/open-source development. Not to be confused with the desktop developer, this Daniel, from TU Berlin, wrote his master's thesis on OpenMP offloading using OpenCL and SPIR-V. You can check out his thesis here.

Valve's Pierre-Loup Griffais has confirmed that Daniel Schürmann is part of Valve's open-source graphics group.

It will be great to see what open-source GPU driver improvements they deliver this year.



Read the full article here by Phoronix

Friday, 9 February 2018

VLC 3.0 Released

As expected, the VLC 3.0 media player is now available!

"

This release is the result of more than three year of volunteer work, add numerous new features, fixes more than 1500 bugs and show more than 20.000 commits. All platform share the same code now. VLC 3.0 is a massive change in the VLC core,

" wrote Jean-Baptiste Kempf in the press announcement.

VLC 3.0 adds 360-degree video support, direct HDR on Windows, HD audio codec pass-through, Google Chromecast support, better hardware decoding, HTTP/2 support, better UPnP, adaptive streaming, initial work on Wayland support, optional systemd support, zero-copy GStreamer video decoding, a switch to OpenGL by default (rather than X-Video) on Linux, and countless other improvements.

VLC 3.0 can be downloaded from VideoLAN.org.



Read the full article here by Phoronix

Facebook Is Testing a Dislike Button

Ever since the inception of the Like button, Facebook users have been asking for a "dislike" button. Today, Facebook is testing a "downvote" button with certain users in the comment section of posts within Facebook groups and on old Facebook memories content. The Daily Beast reports: The feature appears to give users the ability to downrank certain comments. This is the first time Facebook has tested anything similar to a "dislike" button and it could theoretically allow for content that's offensive or irrelevant to be pushed to the bottom of a comment feed. In 2016, citing Facebook executives, Bloomberg said a dislike button "had been rejected on the grounds that it would sow too much negativity" to the platform. It's unclear how widely the dislike button is being tested. Facebook regularly tests features with small subsets of users that never end up rolling out to the broader public. Most users currently are only able to either Like or Reply to comments in a thread. The downvote option could have radical implications for what types of discussions and comments flourish on the platform. While it could theoretically be used to de-rank inflammatory or problematic comments, it could also easily be used as a tool for abuse.

Read more of this story at Slashdot.



Read the full article here by Slashdot

Sunday, 4 February 2018

Crowdfunding Campaign Seeks a Fully Open Source Alternative to Citrix XenServer

"Free/libre and 100% community backed version of XenServer," promises a new Kickstarter page, adding that "Our first prototype (and proof of concept) is already functional." Currently, XenServer is a turnkey virtualization platform, distributed as a distribution (based on CentOS). It comes with a feature rich toolstack, called XAPI. The vast majority of XenServer code is Open Source. But since XenServer 7.3, Citrix removed a lot of features from it. The goal of XCP-ng is to make a fully community backed version of XenServer, without any feature restrictions. We also aim to create a real ecosystem, not depending on one company only. Simple equation: the more we are, the healthier is the environment. The campaign reached its fundraising goal within a few hours, reports long-time Slashdot reader NoOnesMessiah, and within three days they'd already raised four times the needed amount and began unlocking their stretch goals.

Read more of this story at Slashdot.



Read the full article here by Slashdot


Malware Exploiting Spectre, Meltdown CPU Flaws Emerges

wiredmikey quotes SecurityWeek: Researchers have discovered more than 130 malware samples designed to exploit the recently disclosed Spectre and Meltdown CPU vulnerabilities. While a majority of the samples appear to be in the testing phase, we could soon start seeing attacks... On Wednesday, antivirus testing firm AV-TEST told SecurityWeek that it has obtained 139 samples from various sources, including researchers, testers and antivirus companies... Fortinet, which also analyzed many of the samples, confirmed that a majority of them were based on available proof of concept code. Andreas Marx, CEO of AV-TEST, believes different groups are working on the PoC exploits to determine if they can be used for some purpose. "Most likely, malicious purposes at some point," he said.

Read more of this story at Slashdot.



Read the full article here by Slashdot

PUBG Anti-Cheat Update Coming Next Week

The Head of Service Management & Anti-Cheat at Playerunknown's Battlegrounds has announced that they have developed a new anti-cheat solution and will be deploying an early version of it on the servers next week. The solution was developed in house and has been tested on their test servers. The main focus of the new system is blocking unauthorized programs that hook into the game and transform game files. In addition, PUBG is upgrading the in-game report function so reported content can be investigated faster and more accurately; adding file-modification checking, where any modification or deletion of game files may result in a ban; and cracking down on account sharing, no longer allowing Family Sharing on Steam.
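File-modification checking of this kind is commonly built on content hashing. The sketch below is a hypothetical illustration of the general idea, not PUBG's actual system (the function names and the snapshot/verify split are assumptions): hash each game file at a known-good point, then re-hash later and flag anything changed or missing.

```python
import hashlib
import os

def snapshot(paths):
    """Record a SHA-256 digest of each game file at a known-good moment
    (e.g. right after installation or patching)."""
    digests = {}
    for path in paths:
        with open(path, "rb") as f:
            digests[path] = hashlib.sha256(f.read()).hexdigest()
    return digests

def verify(digests):
    """Re-hash every file and report any that were modified or deleted
    since the snapshot was taken."""
    flagged = []
    for path, expected in digests.items():
        if not os.path.exists(path):
            flagged.append((path, "deleted"))
            continue
        with open(path, "rb") as f:
            if hashlib.sha256(f.read()).hexdigest() != expected:
                flagged.append((path, "modified"))
    return flagged
```

A real anti-cheat would additionally have to protect the digest store itself and check files while the game runs, since an in-memory hook never touches the files on disk.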


Sounds good to me. I have not played many rounds in PUBG, but I will say that in the few I have, several ended suspiciously. The primary focus of this anti-cheat makes me worry for people that use things like RTSS, Afterburner, and the like for FPS reporting; we don't want another unjust ban wave happening. The comments section on the announcement implies that players have another idea to quell cheating.

The internally developed anti-cheat solution is planned to be upgraded steadily after the first implementation next week. As mentioned earlier, the eradication of cheat programs will not end with a few attempts and actions. In addition to the systems currently in development and already implemented, we are looking into a more effective system, and we will actively introduce any solutions that were confirmed to be reliable and accurate. We will continue taking firm measures against the developers, distributors and users of cheats. We promise you that we will do our best every day in our battle for a fair game environment.




Read the full article here by [H]ardOCP News/Article Feed

Friday, 2 February 2018

eBay Is Dumping PayPal For Dutch Rival Adyen

schwit1 shares a report from CNN: EBay, one of the world's biggest online marketplaces, announced Wednesday that it's dropping PayPal as its main partner for processing payments in favor of Dutch company Adyen. In 2002, eBay paid $1.5 billion to buy PayPal, an online payments company whose founders include Silicon Valley heavyweights Elon Musk and Peter Thiel. It proved to be a very successful investment. When eBay spun off PayPal in 2015 -- something investors and analysts had urged it to do -- the payments company's market value was close to $50 billion. It's now above $100 billion. Based in Amsterdam, Adyen already works with other big tech companies including Uber and Netflix. It says it handles more than 200 different payment methods and over 150 currencies. The shift will start gradually in North America later this year and eBay expects most marketplace customers around the world to be using the new system in 2021.

Read more of this story at Slashdot.



Read the full article here by Slashdot

Why Alexa Won't Light Up During Amazon's Super Bowl Ad

Bloomberg: Amazon.com is advertising its Alexa-powered speakers in the big game on Sunday. It's an amusing 90 seconds that features celebrities like Gordon Ramsay, Rebel Wilson, Anthony Hopkins, Cardi B and the world's wealthiest man, Jeff Bezos himself. The word "Alexa" is uttered 10 times during the Super Bowl spot, but thankfully, the Amazon Echo in your living room isn't going to perk up and try to respond. Bezos and company have evidently been thinking about this problem for a long time, before the Echo was even introduced. A September 2014 Amazon patent titled "Audible command filtering" describes techniques to prevent Alexa from waking up "as part of a broadcast watched by a large population (such as during a popular sporting event)," annoying customers and overloading Amazon's servers with millions of simultaneous requests. The patent broadly describes two techniques. The first calls for transmitting a snippet of a commercial to Echo devices before it airs. Then the Echo can compare live commands to the acoustic fingerprint of the snippet to determine whether the commands are authentic. The second tactic describes how a commercial itself could transmit an inaudible acoustic signal to tell Alexa to ignore its wake word.
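The patent's first technique can be sketched in miniature: compute a fingerprint of the known commercial audio ahead of time, then compare live audio against it and suppress the wake word on a close match. Everything below (the one-bit-per-frame energy fingerprint, the names, the 0.9 threshold) is a toy illustration, not Amazon's implementation.

```python
import math

def fingerprint(samples, frame=256):
    """Crude acoustic fingerprint: one bit per frame, set when the frame's
    energy rises relative to the previous frame (a toy stand-in for a real
    audio fingerprinting scheme)."""
    energies = [
        sum(s * s for s in samples[i:i + frame])
        for i in range(0, len(samples) - frame + 1, frame)
    ]
    return [1 if b > a else 0 for a, b in zip(energies, energies[1:])]

def should_ignore_wake_word(live, known, threshold=0.9):
    """True when the live audio's fingerprint agrees closely enough with
    the pre-delivered commercial fingerprint to treat it as the broadcast
    ad rather than an authentic command."""
    n = min(len(live), len(known))
    if n == 0:
        return False
    agree = sum(1 for a, b in zip(live, known) if a == b)
    return agree / n >= threshold

# Synthetic "commercial" audio: a tone whose loudness steps every 256 samples.
ad = [math.sin(i / 5) * (1 + (i // 256) % 3) for i in range(4096)]
print(should_ignore_wake_word(fingerprint(ad), fingerprint(ad)))        # True
print(should_ignore_wake_word(fingerprint(ad), fingerprint(ad[::-1])))  # False
```

The second technique in the patent avoids comparison entirely: the ad carries an inaudible signal that directly tells the device to stand down.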

Read more of this story at Slashdot.



Read the full article here by Slashdot

DuckDuckGo CEO: 'Google and Facebook Are Watching Our Every Move Online. It's Time To Make Them Stop'

An anonymous reader shares a report from CNBC, written by Gabriel Weinberg, CEO and founder of DuckDuckGo: You may know that hidden trackers lurk on most websites you visit, soaking up your personal information. What you may not realize, though, is 76 percent of websites now contain hidden Google trackers, and 24 percent have hidden Facebook trackers, according to the Princeton Web Transparency & Accountability Project. The next highest is Twitter with 12 percent. It is likely that Google or Facebook are watching you on many sites you visit, in addition to tracking you when using their products. As a result, these two companies have amassed huge data profiles on each person, which can include your interests, purchases, search, browsing and location history, and much more. They then make your sensitive data profile available for invasive targeted advertising that can follow you around the Internet. [...] So how do we move forward from here? Don't be fooled by claims of self-regulation, as any useful long-term reforms of Google and Facebook's data privacy practices fundamentally oppose their core business models: hyper-targeted advertising based on more and more intrusive personal surveillance. Change must come from the outside. Unfortunately, we've seen relatively little from Washington. Congress and federal agencies need to take a fresh look at what can be done to curb these data monopolies. They first need to demand more algorithmic and privacy policy transparency, so people can truly understand the extent of how their personal information is being collected, processed and used by these companies. Only then can informed consent be possible. They also need to legislate that people own their own data, enabling real opt-outs. Finally, they need to restrict how data can be combined including being more aggressive at blocking acquisitions that further consolidate data power, which will pave the way for more competition in digital advertising. 
Until we see such meaningful changes, consumers should vote with their feet.

Read more of this story at Slashdot.



Read the full article here by Slashdot

Thursday, 1 February 2018

AMD to Ramp up GPU Production, But RAM a Limiting Factor

One of the more tricky issues revolving around the GPU shortages of the past several months has been the matter of how to address the problem on the GPU supply side of matters. While the crux of the problem has been a massive shift in demand driven by a spike in cryptocurrency prices, demand has also not tapered off like many of us would have hoped. And while I hesitate to define the current situation as the new normal, if demand isn’t going to wane then bringing video card prices back down to reasonable levels is going to require a supply-side solution.

This of course sounds a lot easier than it actually is. Ignoring for the moment that GPU orders take months to process – there are a lot of steps in making a 16nm/14nm FinFET wafer – the bigger risk is that cryptocurrency-induced GPU demand is not stable. Ramping up GPU production means gambling that demand will stay high enough long enough to absorb the additional GPUs, and then not immediately contract and leave the market flooded with used video cards. The latter is an important point that AMD got burnt on the last time this happened, when the collapse of cryptocurrency prices, and with it the demand for video cards, flooded the market with used Hawaii (290/390 series) cards.

Getting to the heart of matters then, in yesterday’s Q&A session for their Q4’2017 earnings call, an analyst asked AMD about the current GPU supply situation and whether AMD would be ramping up GPU production. The answer, much to my surprise, was yes. But with a catch.

Q: I just had a question on crypto, I mean if I look at the amount of hash compute being added to Ethereum in January I mean it's more than the whole of Q4, so we have seen a big start to the Q1. […] And is there any sort of acute shortages here, I mean can your foundry partners do they have the capacity to support you with a ramp of GPUs at the moment and is there enough HBM2 DRAM to source as well?

A: Relative to just where we are in the market today, for sure the GPU channel is lower than we would like it to be, so we are ramping up our production. At this point we are not limited by silicon per se, so our foundry partners are supplying us, there are shortages in memory and I think that is true across the board, whether you are talking about GDDR5, or you’re talking about high bandwidth memory. We continue to work through that, with our memory partners and that will be certainly one of the key factors as we go through 2018.

So yes, AMD is ramping up GPU production. Which is a surprising move since they were burnt the last time they did this. At the same time however, while cryptocurrency demand has hit both major GPU manufacturers, AMD has been uniquely hit as they’re a smaller player less able to absorb rapid changes in demand, and, more importantly, their GPUs are better suited for the task. AMD’s tradition of offering more memory bandwidth and more raw FLOPS than NVIDIA at any competing price point, coupled with some meaningful architectural differences, means that their GPUs are in especially high demand by cryptocurrency miners.

But perhaps the more interesting point here isn’t that AMD is increasing their GPU production, but why they can only increase it by so much. According to the company, they’re actually RAM-limited. They can make more GPUs, but they don’t have enough RAM – be it GDDR5 or HBM2 – to equip all of the cards AMD and board partners would like to make.

This is an interesting revelation, as this is the first time memory shortages have been explicitly identified as an issue in this latest run-up. We’ve known that the memory market is extremely tight due to demand – with multiple manufacturers increasing their RAM prices and diverting GDDR5 production over to DDR4 – but only now is that catching up with video card production to the point that current GDDR5 production levels are no longer “enough”. Of course RAM of all types is still in high demand here at the start of 2018, so while memory manufacturers can reallocate some more production back to GDDR5, GPU and board vendors have to fight with both the server and mobile markets, both of which have their own booms in demand going on, and are willing to pay top dollar for the RAM they need.


GDDR5: The Key To Digital Gold

In a sense the addition of cryptocurrency to the mix of computing workloads has created a perfect storm in an industry that was already dealing with RAM shortages. The RAM market is in the middle of a boom right now – part of its traditional boom/bust cycle – and while it will eventually abate as demand slips and more production gets built, for the moment cryptocurrency mining has just added yet more demand for RAM that isn’t there. Virtually all supply/demand problems can be solved through higher prices – at some point, someone has to give up – but given the trends we’ve seen so far, GPU users are probably the most likely to suffer, as traditionally the GPU market has been built on offering powerful processors paired with plenty of RAM for paltry prices. Put another way, even if the GPU supply situation were resolved tomorrow and there were infinite GPUs for all, RAM prices would be a bottleneck that kept video card prices from coming back down to MSRP.

With all that said, however, AMD’s brief response in their earnings call has been the only statement of substance they’ve made on the matter. So while the company is (thankfully) ramping up GPU production, they haven’t – and are unlikely to ever – disclose just how many more GPUs that is, or for that matter how much RAM they expect they and partners can get for those new GPUs. So while any additional production will at least help the current situation to some extent, I would caution against getting too hopeful about AMD’s ramp-up bringing the video card shortage to an end.



Read the full article here by AnandTech Article Channel

Red Hat buys the creator of a Chrome-based OS for servers

The underpinnings of Chrome OS have found their way into the server room in a very roundabout way. Red Hat has acquired CoreOS, the creators of an operating system for containerized apps (Container Linux) that shares roots with both Google's Chromium OS project and Gentoo Linux. The $250 million deal promises to help Red Hat fulfill its dreams of helping people use open code to deploy apps in any environment they like, whether it's on a local network or multiple cloud services.

CoreOS has played a particularly major role in Kubernetes, the Google-built open platform for deploying those containerized apps. It's the second-largest contributor to the project behind Google itself, Red Hat said. Additional tools like Tectonic and Quay have made it easier for big businesses to move and track apps.

You probably won't notice any of the Chrome OS influence at Red Hat. However, this shows just how far Google's web-centric platform has spread. Elements of an OS originally designed for frugal PCs will soon find their way into products from the open source world's biggest business provider -- Google definitely didn't anticipate that.

Source: Red Hat, CoreOS



Read the full article here by Engadget


Tuesday, 30 January 2018

Red Hat Is Acquiring CoreOS

Red Hat is betting big on the container game getting bigger, making public this afternoon its agreement to acquire CoreOS for $250 million USD.

CoreOS produces Container Linux, a lightweight Linux distribution designed for software containers. Container Linux relies upon Gentoo / Chromium OS foundations, but it will be interesting to see how that changes now under Red Hat ownership.

In the Red Hat press release just sent out to us, Red Hat writes:

CoreOS’s technologies, including Tectonic and Container Linux, will combine with Red Hat’s already-broad Kubernetes and container-native product portfolio to further accelerate adoption and development of the industry’s leading hybrid cloud platform for modern application workloads.

Early to embrace containers and container orchestration, Red Hat contributes deeply to related open source communities, including acting as the second-leading Kubernetes contributor behind only Google. Red Hat also helps enable organizations around the world to embrace container-based applications, offering an enterprise-ready container solution in Red Hat OpenShift, the industry’s most comprehensive enterprise Kubernetes platform. With the addition of CoreOS, Red Hat amplifies its leadership at both the community and enterprise level around container-native and orchestration technologies.


Read the full article here by Phoronix