
  • Why ATM Bombs May Be Coming Soon To the United States

    HughPickens.com writes: Nick Summers has an interesting article at Bloomberg about the 90 ATM bombings that have hit Britain since 2013. ATMs are vulnerable because the strongbox inside an ATM has two essential holes: a small slot in front that spits out bills to customers and a big door in back through which employees load reams of cash in large cassettes. "Criminals have learned to see this simple enclosure as a physics problem," writes Summers. "Gas is pumped in, and when it's detonated, the weakest part—the large hinged door—is forced open. After an ATM blast, thieves force their way into the bank itself, where the now-gaping rear of the cash machine is either exposed in the lobby or inside a trivially secured room. Set off with skill, the shock wave leaves the money neatly stacked, sometimes with a whiff of the distinctive garlic odor of acetylene." The rise in gas attacks has created a market opportunity for the companies that construct ATM components. Several manufacturers now make anti-gas-attack modules: some absorb shock waves, some detect gas and render it harmless, and some emit sound, fog, or dye to discourage thieves in the act.

    As far as anyone knows, there has never been a gas attack on an American ATM. The leading theory points to the country's primitive ATM cards. Along with Mongolia, Papua New Guinea, and not many other countries, the U.S. doesn't require its plastic to contain an encryption chip, so stealing cards remains an effective, nonviolent way to get at the cash in an ATM. Encryption chip requirements are coming to the U.S. later this year, though. And given the gas raid's many advantages, it may be only a matter of time until the back of an American ATM comes rocketing off.

    279 comments | 7 hours ago

  • NVIDIA GTX 970 Specifications Corrected, Memory Pools Explained

    Vigile writes: Over the weekend NVIDIA sent out its first official response to the claims of hampered performance on the GTX 970 and a potential lack of access to 1/8th of the on-board memory. Today NVIDIA has clarified the situation again, this time with some important changes to the specifications of the GPU. First, the ROP count and L2 cache capacity of the GTX 970 were incorrectly reported at launch (last September). The GTX 970 has 56 ROPs and 1792 KB of L2 cache, compared to the GTX 980's 64 ROPs and 2048 KB of L2 cache; previously both GPUs were claimed to have identical specs. Because of this change, one of the 32-bit memory channels is accessed differently, forcing NVIDIA to segment the memory into 3.5GB and 0.5GB pools to improve overall performance for the majority of use cases. The smaller, 500MB pool operates at 1/7th the speed of the 3.5GB pool and thus lowers total graphics system performance by 4-6% when added into the memory system. That only occurs when games request more than 3.5GB of memory, though, which happens only in extreme combinations of resolution and anti-aliasing. Still, the jury is out on whether NVIDIA has answered enough questions to temper the fire from consumers.
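
    A back-of-the-envelope check on those figures, assuming the 256-bit bus is divided into eight 32-bit channels, seven of which stripe together for the large pool (our reading of NVIDIA's statement, not an official diagram):

    ```cpp
    #include <cstdio>

    // Arithmetic implied by the corrected GTX 970 specs. Assumption:
    // eight 32-bit channels; seven serve the 3.5GB pool, one serves
    // the 0.5GB pool alone.
    int main() {
        const double gbps_per_pin = 7.0;                 // effective GDDR5 rate
        const int channel_bits = 32;
        const double per_channel = gbps_per_pin * channel_bits / 8; // 28 GB/s

        printf("3.5GB pool (7 channels): %.0f GB/s\n", 7 * per_channel); // 196
        printf("0.5GB pool (1 channel):  %.0f GB/s\n", 1 * per_channel); // 28
        // 28 / 196 = 1/7, matching the "1/7th the speed" figure above,
        // and 8 x 28 = 224 GB/s matches the card's rated total bandwidth.
        return 0;
    }
    ```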

    113 comments | 2 days ago

  • DirectX 12 Lies Dormant Within Microsoft's Recent Windows 10 Update

    MojoKid writes: After last Wednesday's Windows 10 event, early adopters and IT types were probably anxious for Microsoft to release the next preview build. Fortunately, it didn't take long, as it came out on Friday, and it's safe to say that it introduced even more than many were anticipating (but still no Spartan browser). However, in case you missed it, DirectX 12 is actually enabled in this Windows 10 release, though unfortunately we'll need to wait for graphics drivers and apps that support it before we can take advantage of DX12 features and performance enhancements.
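
    A minimal way to confirm the runtime itself shipped with a build is to probe for d3d12.dll and its device-creation export (a sketch; finding the DLL says nothing about driver or application support):

    ```cpp
    #include <windows.h>
    #include <cstdio>

    // Probe for the DirectX 12 runtime on a Windows 10 preview build.
    int main() {
        HMODULE d3d12 = LoadLibraryW(L"d3d12.dll");
        if (!d3d12) {
            printf("d3d12.dll not found: no DX12 runtime on this build\n");
            return 1;
        }
        FARPROC create = GetProcAddress(d3d12, "D3D12CreateDevice");
        printf("DX12 runtime present, D3D12CreateDevice %s\n",
               create ? "exported" : "missing");
        FreeLibrary(d3d12);
        return 0;
    }
    ```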

    133 comments | 3 days ago

  • Ask Slashdot: GPU of Choice For OpenCL On Linux?

    Bram Stolk writes: So, I am running GNU/Linux on a modern Haswell CPU, with an old Radeon HD5xxx from 2009. I'm pretty happy with the open source Gallium driver for 3D acceleration. But now I want to do some GPGPU development using OpenCL on this box, and the old GPU will no longer cut it. What do my fellow technophiles from Slashdot recommend as a replacement GPU? Go NVIDIA, go AMD, or just use the integrated Intel GPU instead? Bonus points for open source solutions. Performance is not really important, but OpenCL driver maturity is.
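
    Whichever vendor wins out, a quick way to gauge what a stack's OpenCL driver actually exposes is to enumerate platforms and devices (a minimal sketch against the standard OpenCL C API; build with -lOpenCL):

    ```cpp
    #include <CL/cl.h>
    #include <cstdio>

    // List every OpenCL platform/device the installed drivers expose,
    // with the driver-reported name and supported OpenCL version.
    int main() {
        cl_platform_id platforms[8];
        cl_uint nplat = 0;
        clGetPlatformIDs(8, platforms, &nplat);

        for (cl_uint p = 0; p < nplat; ++p) {
            cl_device_id devs[8];
            cl_uint ndev = 0;
            clGetDeviceIDs(platforms[p], CL_DEVICE_TYPE_ALL, 8, devs, &ndev);
            for (cl_uint d = 0; d < ndev; ++d) {
                char name[256] = {0}, ver[256] = {0};
                clGetDeviceInfo(devs[d], CL_DEVICE_NAME, sizeof name, name, nullptr);
                clGetDeviceInfo(devs[d], CL_DEVICE_VERSION, sizeof ver, ver, nullptr);
                printf("platform %u, device %u: %s (%s)\n", p, d, name, ver);
            }
        }
        return 0;
    }
    ```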

    109 comments | 4 days ago

  • NVIDIA Responds To GTX 970 Memory Bug

    Vigile writes: Over the past week or so, owners of the GeForce GTX 970 have found several instances where the GPU was unable or unwilling to address memory capacities over 3.5GB despite having 4GB of on-board frame buffer. Specific benchmarks were written to demonstrate the issue, and users even found ways to configure games to utilize more than 3.5GB of memory using DSR and high levels of MSAA. While the GTX 980 can access 4GB of its memory, the GTX 970 appeared to be less likely to do so and would see a dramatic performance hit when it did. NVIDIA responded today, saying that the GTX 970 has "fewer crossbar resources to the memory system" as a result of disabled groups of cores called SMMs. NVIDIA states that "to optimally manage memory traffic in this configuration, we segment graphics memory into a 3.5GB section and a 0.5GB section" and that the GPU gives "higher priority" to the larger pool. The question that remains: should this affect gamers' view of the GTX 970? If performance metrics already take the different memory configuration into account, then I don't see the GTX 970 declining in popularity.
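
    The community tests follow a simple pattern; here is a sketch in that spirit (not the actual benchmark tool, and where each allocation physically lands is the driver's decision, so results are only suggestive), using the CUDA runtime API:

    ```cpp
    #include <cuda_runtime.h>
    #include <cstdio>

    // Fill the card in 256MB steps and time an on-device copy within
    // each chunk. On a GTX 970, chunks landing in the 0.5GB pool should
    // show markedly lower bandwidth than chunks in the 3.5GB pool.
    int main() {
        const size_t chunk = 256u << 20;             // 256MB
        cudaEvent_t start, stop;
        cudaEventCreate(&start);
        cudaEventCreate(&stop);

        for (int i = 0; i < 16; ++i) {               // up to 4GB; stops when full
            char* buf = nullptr;
            if (cudaMalloc(&buf, chunk) != cudaSuccess) break;
            cudaEventRecord(start);
            // copy the first half of the chunk onto the second half
            cudaMemcpy(buf + chunk / 2, buf, chunk / 2, cudaMemcpyDeviceToDevice);
            cudaEventRecord(stop);
            cudaEventSynchronize(stop);
            float ms = 0.0f;
            cudaEventElapsedTime(&ms, start, stop);
            double gbs = (chunk / 1e9) / (ms / 1e3); // half read + half write
            printf("%4zu MB allocated: %6.1f GB/s\n",
                   (size_t)(i + 1) * (chunk >> 20), gbs);
            // allocations are intentionally leaked so memory keeps filling
        }
        return 0;
    }
    ```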

    145 comments | 5 days ago

  • NVIDIA Launches New Midrange Maxwell-Based GeForce GTX 960 Graphics Card

    MojoKid writes: NVIDIA is launching a new Maxwell desktop graphics card today, targeted at the sweet spot of the graphics card market ($200 or so) currently occupied by its previous-gen GeForce GTX 760 and the older GTX 660. The new GeForce GTX 960 features a brand new Maxwell-based GPU dubbed the GM206. NVIDIA was able to optimize the GM206's power efficiency without moving to a new process by tweaking virtually every part of the GPU. NVIDIA's reference specifications for the GeForce GTX 960 call for a base clock of 1126MHz and a boost clock of 1178MHz. The GPU packs 1024 CUDA cores, 64 texture units, and 32 ROPs, which is half of what's inside NVIDIA's top-end GeForce GTX 980. The 2GB of GDDR5 memory on GeForce GTX 960 cards is clocked at a speedy 7GHz (effective GDDR5 data rate) over a 128-bit memory interface. The new GeForce GTX 960 is a low-power upgrade for gamers with GeForce GTX 660-class cards or older, which make up a good percentage of the market now. It's usually faster than the previous-generation GeForce GTX 760 but, depending on the game title, can trail it as well, due to its narrower memory interface.
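
    The arithmetic behind that caveat, with GTX 960 numbers from the summary and the GTX 760's commonly cited 256-bit, 6Gbps figures added for comparison:

    ```cpp
    #include <cstdio>

    // Peak memory bandwidth = effective data rate x bus width / 8.
    int main() {
        auto gbs = [](double gbps, int bus_bits) { return gbps * bus_bits / 8.0; };
        printf("GTX 960: %.0f GB/s\n", gbs(7.0, 128)); // 112 GB/s
        printf("GTX 760: %.0f GB/s\n", gbs(6.0, 256)); // 192 GB/s
        return 0;
    }
    ```

    NVIDIA credits Maxwell's improved memory compression with closing some of that gap, which fits the "depending on the game title" caveat: the deficit mainly shows in memory-bound titles.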

    114 comments | about a week ago

  • First Look At Dell Venue 8 7000 and Intel's Moorefield Atom Performance

    MojoKid writes: Dell has been strategically setting up their new Venue 8 7000 tablet for cameo appearances over the past few months, starting back at the Intel Developer Forum in September of last year, then again at Dell World in November and at CES 2015. What's interesting about this new device, in addition to Intel's RealSense camera, is its Atom Z3580 quad-core processor, which is based on Intel's latest Moorefield architecture. Moorefield builds upon Intel's Merrifield Atom feature set and offers two additional CPU cores with up to a 2.3GHz clock speed, an enhanced PowerVR G6430 GPU, and support for faster LPDDR3-1600 memory. Moorefield is also built for Intel's XMM 7260 LTE modem platform, which supports carrier aggregation. Overall, Moorefield looks solid, with performance ahead of a Snapdragon 801 but not quite able to catch the 805, NVIDIA Tegra K1, or Apple's A8X in terms of graphics throughput. On the CPU side, Intel's beefed-up quad-core Atom variant shows well.

    22 comments | about a week ago

  • Chrome For OS X Catches Up With Safari's Emoji Support

    According to The Next Web, emoji support has landed in the latest developer builds of Chrome for OS X, meaning that emoji can be seen on websites and entered into text fields for the first time without issues. ... Safari users on OS X could already see emoji on the Web without issue, since Apple built that support in. The bug in Chrome was fixed on December 11, and the fix recently went into testing on Chrome's Canary track. From there, we can expect it to move to the consumer version of Chrome in the coming weeks.

    104 comments | about two weeks ago

  • Intel 5th Gen Core Series Performance Preview With 2015 Dell XPS 13

    MojoKid writes: Intel's strategically timed CES 2015 launch of their new 5th Gen Core Series processors for notebooks was met with a reasonably warm reception, though it's always difficult to rise above the noise of CES chatter. Performance claims for Intel's new chip promise major gains in graphics and more modest increases in standard compute applications. However, the biggest bet Intel placed on the new Broadwell-U architecture is performance-per-watt throughput and battery life in premium notebook products that are now in production with major OEM partners. A few manufacturers were early out of the gate with new Core i5 5XXX series-based machines; however, none of the major players caught the same kind of buzz that Dell received with the introduction of their new XPS 13 Ultrabook and its near bezel-less 13-inch WQHD (3200x1800) display. As expected, the Core i5-5200U in this machine offered performance gains of anywhere from 10 to 20 percent, in round numbers, depending on the benchmark. Gaming and graphics testing is where the new 5200U chip took the largest lead over the previous-gen Core i5-4200U, one of the most common processors found in typical 13-inch ultrabook-style machines.

    97 comments | about three weeks ago

  • AMD, Nvidia Reportedly Tripped Up On Process Shrinks

    itwbennett writes: In the fierce battle between CPU and GPU vendors, it's not just about speeds and feeds but also about process shrinks. Nvidia and AMD have had their moves to 16nm and 20nm designs, respectively, hampered by the limited capacity of both nodes at manufacturer TSMC, according to the enthusiast site WCCFTech.com. While AMD's CPUs are produced by GlobalFoundries, its GPUs are made at TSMC, as are Nvidia's chips. The problem is that TSMC only has so much capacity, and Apple and Samsung have sucked it all up. The only other manufacturer with 14nm capacity is Intel, and there's no way Intel will sell its rivals any.

    230 comments | about three weeks ago

  • Quake On an Oscilloscope

    An anonymous reader writes: Developer Pekka Väänänen has posted a fascinating report on how he got Quake running on an oscilloscope (video link). Obviously, the graphical detail gets stripped down to basic lines, but even then, you need to cull any useless or unseen geometry to make things run smoothly. He says, "To cull the duplicates a std::unordered_set of the C++ standard library is used. The indices of the triangle edges are saved in pairs, packed in a single uint64_t, the lower index being first. The set is cleared between each object, so the same line could still be drawn twice or more if the same vertices are stored in different meshes. Before saving a line for end-of-the-frame-submit, its indices in the mesh are checked against the set, and discarded if already saved this frame. At the end of each frame all saved lines are checked against the depth buffer of the rendered scene. If a line lies completely behind the depth buffer, it can be safely discarded because it shouldn't be visible."
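
    The quoted scheme is straightforward to picture in C++ (a sketch using our own names, not Väänänen's actual code):

    ```cpp
    #include <cstdint>
    #include <unordered_set>
    #include <utility>

    // Edges are keyed by their two vertex indices packed into a single
    // uint64_t, lower index first; the set is cleared between objects.
    std::unordered_set<uint64_t> seen_edges;

    uint64_t pack_edge(uint32_t a, uint32_t b) {
        if (a > b) std::swap(a, b);                  // lower index goes first
        return (static_cast<uint64_t>(a) << 32) | b;
    }

    // True if the edge was not yet saved for this object (and marks it saved).
    bool mark_edge(uint32_t a, uint32_t b) {
        return seen_edges.insert(pack_edge(a, b)).second;
    }
    ```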

    71 comments | about a month ago

  • Phoronix Lauds AMD's Open Source Radeon Driver Progress For 2014

    Phoronix has taken an in-depth look at progress on AMD's open source Radeon driver, and declares 2014 to have been a good year. There are several pages with detailed benchmarks, but the upshot is overwhelmingly positive: Across the board there are huge performance improvements to be found in the open-source AMD Linux graphics driver when comparing its state at the end of 2013 to the current code at the end of this year. The performance improvements and new features presented (among them OpenMAX / AMD video encode, UVD for older AMD GPUs, various new OpenGL extensions, continued work on OpenCL, power management improvements, and the start of open-source HSA) have been nothing short of incredible. Most of the new work benefits the Radeon HD 7000 series and newer (GCN) GPUs, but these tests showed the Radeon HD 6000 series still improving too. ... Coming up before the end of the year will be a fresh comparison of these open-source Radeon driver results against the newest proprietary AMD Catalyst Linux graphics driver.

    44 comments | about a month ago

  • Linux 3.19 Kernel To Start 2015 With Many New Features

    An anonymous reader writes: Linux 3.18 was recently released, making Linux 3.19 the version under development as the year comes to a close. Linux 3.19, the first big kernel update of 2015, is bringing in the new year with many new features: among them the AMDKFD HSA kernel driver, Intel "Skylake" graphics support, Radeon and NVIDIA driver improvements, RAID5/6 improvements for Btrfs, LZ4 compression for SquashFS, better multi-touch support, new input drivers, x86 laptop improvements, and more.

    66 comments | about a month ago

  • Samsung Galaxy Note Edge Review

    MojoKid writes: Differentiation is difficult in the smartphone market these days. Larger screens, faster processors, additional sensors, and higher-resolution cameras are all nice upgrades, but they are only iterative, especially when you consider the deluge of products that come to market. True innovation is coming along with less frequency, and Samsung, perhaps more so than some other players, is guilty of punching out so many different phone models that it's hard not to gloss over new releases. However, the new Samsung Galaxy Note Edge may offer something truly useful and innovative with its supplementary 160-pixel curved edge display. The Note Edge is based on the same internal platform as the Galaxy Note 4, and features a 2.7GHz Qualcomm Snapdragon 805 SoC with Adreno 420 graphics and 3GB of RAM. What makes the Galaxy Note Edge so different from virtually all other smartphones on the market is its curved edge display and what Samsung calls its "revolving UI," which offers app shortcuts, status updates, data feeds, and features all on its own, but integrated with the rest of the UI on the primary display. You can cycle through various "edge panels," as Samsung calls them, like shortcuts to your favorite apps, a Twitter ticker, news feeds, and a tools panel for quick access to the alarm clock, stopwatch, a flashlight app, audio recorder, and even a digital ruler. The Galaxy Note Edge may not be for everyone, but Samsung actually took curved display technology and built something useful out of it.

    75 comments | about a month ago

  • Touring a Carnival Cruise Simulator: 210 Degrees of GeForce-Powered Projection

    MojoKid writes: Recently, Carnival Cruise Lines gave tours of their C-SMART facility in Almere, the Netherlands. This facility is one of a handful in the world that can provide both extensive training and certification on cruise ships as well as a comprehensive simulation of what it's like to command one. Simulating the operation of a Carnival cruise ship is anything but simple. Let's start with a ship that's at least passingly familiar to most people — the RMS Titanic. At roughly 46,000 tons and 882 feet long, she was, briefly, the largest vessel afloat. Compared to a modern cruise ship, however, Titanic was a pipsqueak. As the size and complexity of the ships has grown, the need for complete simulators has grown as well. The C-SMART facility currently sports two full bridge simulators, several partial bridges, and multiple engineering rooms. When the Costa Concordia wrecked off the coast of Italy several years ago, C-SMART was used to simulate the wreck based on the black boxes from the ship itself. When C-SMART moves to its new facilities, it'll pick up an enormous improvement in processing power: the next-gen visual system will be powered by 104 GeForce GRID systems running banks of GTX 980 GPUs. C-SMART executives claim this will actually substantially reduce their total power consumption, thanks to the efficiency of the Maxwell GPUs. Which solution is currently in place was left unclear, but the total number of installed systems is dropping from just over 500 to roughly 100 rackmounted units.

    42 comments | about a month and a half ago

  • Forbes Blasts Latest Windows 7 Patch as Malware

    Forbes contributor Jason Evangelho has nothing good to say about a recent Windows 7 patch that's causing a range of trouble for some users. He writes: If you have Windows 7 set to automatically update every Tuesday, it may be time to permanently disable that feature. Microsoft has just confirmed that a recent update — specifically KB 3004394 — is causing a range of serious problems and recommends removing it. The first issue that caught my attention, via AMD's Robert Hallock, is that KB 3004394 blocks the installation or update of graphics drivers such as AMD's new Catalyst Omega. Nvidia users are also reporting difficulty installing GeForce drivers, though I can't confirm this personally as my machines are all on Windows 8.1. Hallock recommended manually uninstalling the update, advice now echoed officially by Microsoft. More troubles are detailed in the article; on the upside, Microsoft has released a fix.
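
    For reference, an update like this can be removed from an elevated command prompt using the standard wusa syntax (verify the KB number against Microsoft's advisory before running):

    ```
    wusa.exe /uninstall /kb:3004394
    ```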

    230 comments | about a month and a half ago

  • LG To Show Off New 55-Inch 8K Display at CES

    MojoKid writes: One of the most in-your-face buzzwords of the past year has been "4K," and there's little doubt that the forthcoming CES show in early January will bring it back in full force. As it stands today, 4K really isn't that rare, or expensive; you can even get 4K PC monitors for an attractive price. There does remain one issue, however: a lack of 4K content. We're beginning to see things improve, but it's still slow going. Given that, you might imagine that display vendors would hold off on trying to push the resolution envelope further – but you just can't stop hardware vendors. Earlier this year, both Apple and Dell unveiled "5K" displays that nearly doubled the pixel count of 4K displays. 4K already brutalizes top-end graphics cards and lacks widely available video content, and yet here we are looking at the prospect of 5K. Many jaws dropped when 4K was first announced, and likewise with 5K. Now? Well, yes, 8K is on its way, and we have LG to thank for that. At CES, the company will be showing off a 55-inch display that boasts a staggering 33 million pixels, derived from a resolution of 7680x4320. It might not be immediately clear, but that's far more pixels than 4K, which suggests this whole "K" system of measuring resolutions is a little odd. On paper, you might imagine that 8K has twice the pixels of 4K, but instead, it's 4x.
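
    The math, for the curious: the "K" label tracks horizontal resolution only, so doubling both dimensions quadruples the pixel count.

    ```cpp
    #include <cstdio>

    // Why 8K is 4x the pixels of 4K, not 2x.
    int main() {
        const long uhd4k = 3840L * 2160;  //  8,294,400 pixels
        const long uhd8k = 7680L * 4320;  // 33,177,600 pixels
        printf("8K / 4K pixel ratio: %.1fx\n", (double)uhd8k / uhd4k); // 4.0x
        return 0;
    }
    ```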

    179 comments | about a month and a half ago

  • AMD Offers a Performance Boost, Over 20 New Features With Catalyst Omega Drivers

    MojoKid writes: AMD just dropped its new Catalyst Omega driver package, the culmination of six months of development work. AMD Catalyst Omega reportedly brings over 20 new features and a wealth of bug fixes to the table, along with performance increases on both AMD Radeon GPUs and integrated AMD APUs. Some of the new functionality includes Virtual Super Resolution, or VSR. VSR is "game- and engine-agnostic" and renders content at up to 4K resolution, then displays it at a resolution that your monitor actually supports. AMD says VSR allows for increased image quality, similar in concept to Super Sampling Anti-Aliasing (SSAA); a sketch of the idea follows below. Another added perk of VSR is the ability to see more content on the screen at once. To take advantage of VSR, you'll need a Radeon R9 295X2, R9 290X, R9 290, or R9 285 discrete graphics card; both single- and multi-GPU configurations are currently supported. VSR is essentially AMD's answer to NVIDIA's DSR, or Dynamic Super Resolution. In addition, AMD is claiming performance enhancements in a number of top titles with these new drivers: reportedly anywhere from a 6 percent improvement in FIFA Online to as much as a 29 percent increase in Batman: Arkham Origins when using an AMD 7000-series APU, for example. On discrete GPUs, an AMD Radeon R9 290X's gains ranged from 8 percent in GRID 2 to roughly 16 percent in BioShock Infinite.
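
    As referenced above, a conceptual sketch of the super-sampling idea (VSR itself lives in the driver; this is only the downscale step, grayscale for brevity): render at 2x the width and height, then average each 2x2 block down to one output pixel.

    ```cpp
    #include <cstdint>
    #include <vector>

    // Box-filter a 2x-supersampled grayscale image down to output size.
    std::vector<uint8_t> downsample_2x(const std::vector<uint8_t>& hi,
                                       int out_w, int out_h) {
        std::vector<uint8_t> lo(out_w * out_h);
        const int hi_w = out_w * 2;                 // supersampled width
        for (int y = 0; y < out_h; ++y)
            for (int x = 0; x < out_w; ++x) {
                int sum = hi[(2 * y)     * hi_w + 2 * x]
                        + hi[(2 * y)     * hi_w + 2 * x + 1]
                        + hi[(2 * y + 1) * hi_w + 2 * x]
                        + hi[(2 * y + 1) * hi_w + 2 * x + 1];
                lo[y * out_w + x] = static_cast<uint8_t>(sum / 4); // average
            }
        return lo;
    }
    ```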

    73 comments | about a month and a half ago

  • Just-Announced X.Org Security Flaws Affect Code Dating Back To 1987

    An anonymous reader writes: Some of the worst X.Org security issues were just publicized in an X.Org security advisory. The vulnerabilities deal with protocol handling issues; 12 CVEs were published, and code within X11 dating back to 1987 is affected. Fixes for the X Server are temporarily available via this Git repository.

    172 comments | about a month ago

  • Spectrum Vega: A Blast From the Past

    mikejuk writes: A new games console is being launched based on the classic Sinclair ZX Spectrum from the '80s. Within days of the start of its Indiegogo campaign, all 1,000 Limited Edition Spectrum Vegas had been claimed, but there is still a chance to get your hands on one from the second batch. The Sinclair Spectrum Vega is truly retro in the sense that it plugs into a TV, avoiding the need for a monitor, and comes complete with around 1,000 games built in. Games are accessed through a menu-based system, and once selected they load automatically, taking the player directly into game play. This is very different from the original Spectrum with its rubber-topped keyboard and BASIC interface. If you have existing Spectrum games you'd like to play, you can use an SD card to load them onto the Vega, though the current publicity material doesn't give much clue as to how you go from ancient cassette tape to SD card. As for programming new games, there are ZX Spectrum emulators for Windows that are free and ready to use.

    110 comments | about two months ago
