Tape is the unquestioned monarch of archival media. Although innovations in fields like DNA and glass storage provide a peek into the future, there is presently no substitute for tape that can match its durability, affordability, and dependability. But whether they use the cloud or store data on-site, organizations still confront a number of obstacles when it comes to data management and preservation. We spoke with a source at storage company Spectra Logic to learn more about how hybrid perpetual storage may address some of the most complex data issues that businesses face today.
How do you see flash evolving in the coming years? How will this affect the storage market?
NAND flash has been, and will likely continue to be, the storage industry’s fastest-growing technology. Both consumers and businesses appreciate its durability and speed, but the pursuit of ever-increasing storage capacity will be the primary focus of future innovation in the flash sector. Storing more bits per cell reduces the number of times a cell can be programmed, which in turn limits flash endurance, so this route offers little headroom for future capacity advances beyond what the switch from planar (2D) to 3D NAND has already delivered. Reducing the size of the cells is another way to increase flash capacity, but with the industry’s goal set at 19 nm and the flash roadmap already at 20 nm, this too appears to be a dead end.
Increasing the number of layers on a chip has the best potential to increase flash capacity; however, there are technical difficulties in fabricating devices with 100 or more layers. This is just one of the many reasons why no suppliers are discussing going beyond 136 layers in a single-stack part. That’s why we think string stacking will be the main driver of future capacity advances in flash. String stacking joins multiple multi-layer flash dies into a single flash chip with a higher effective layer count, though this could mean that flash price declines slow down. System and cloud providers will also use the zoned interface, which physically arranges data into zones matching the performance needs of that data, to extend the lifespan of their flash assets, improve performance, and increase capacity.
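To illustrate the idea behind a zoned interface, here is a minimal sketch of a host-side placement policy: data is appended to zones grouped by how often it is expected to be rewritten, so the device can reset whole zones instead of garbage-collecting mixed data. The classes and method names are purely illustrative, not a real device API.

```python
# Sketch of zone-based data placement: group data by expected rewrite rate
# so whole zones (whole flash blocks) can be reset at once, reducing
# write amplification. All names here are illustrative, not a real API.
from dataclasses import dataclass, field

@dataclass
class Zone:
    zone_id: int
    capacity: int               # bytes the zone can hold
    write_pointer: int = 0      # zones are append-only
    records: list = field(default_factory=list)

    def append(self, record: bytes) -> bool:
        if self.write_pointer + len(record) > self.capacity:
            return False        # zone full; caller must use another zone
        self.records.append(record)
        self.write_pointer += len(record)
        return True

    def reset(self) -> None:
        # Resetting a whole zone maps to erasing whole flash blocks,
        # avoiding relocation of still-valid data.
        self.records.clear()
        self.write_pointer = 0

class ZonedPlacer:
    """Route records to zones based on a caller-supplied temperature hint."""
    def __init__(self, zone_capacity: int = 1 << 20, zones_per_class: int = 2):
        self.pools = {
            "hot": [Zone(i, zone_capacity) for i in range(zones_per_class)],
            "cold": [Zone(i + zones_per_class, zone_capacity) for i in range(zones_per_class)],
        }

    def write(self, record: bytes, temperature: str = "cold") -> int:
        for zone in self.pools[temperature]:
            if zone.append(record):
                return zone.zone_id
        raise RuntimeError("all zones in this class are full; reset or add zones")

placer = ZonedPlacer()
placer.write(b"frequently rewritten metadata", temperature="hot")
placer.write(b"archival blob", temperature="cold")
```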
How has magnetic disk been affected by market factors?
The total number of disk drives shipped over the past four quarters has decreased by roughly 22%, to 255 million from 328 million a year ago. The introduction of flash technology into previously disk-only markets has contributed to this downturn. For instance, the vast majority of computers now ship with flash storage, and the latest generation of gaming consoles relies entirely on flash-based technology. The 2.5-inch disk market is contracting, but the 3.5-inch nearline disk drive market is growing in both capacity and volume year over year. It is marketed mostly to large IT shops and cloud providers and currently accounts for more than half of all disk sales. Focusing on a single product with a few variants has allowed disk firms to remain profitable as their legacy business has eroded.
Despite numerous ongoing developments and a lengthy LTO roadmap, tape appears to be here to stay. Where does tape go from here, and what are the most important innovations to watch?
Tape is most definitely not going anywhere. The format is ideal for long-term storage, and many businesses have been saved from ransomware attacks thanks to tape’s air-gap capability. Several of the largest companies in the world, including cloud service providers, still rely on tape. In fact, tape is experiencing a renaissance, since no other storage medium in use today can match its density and low cost.
The demand for digital tape in the long-term archive market keeps rising, even as the use of tape for backing up primary disk systems declines each year (IT backup has shifted to disk-based technologies). The advantages tape offers in this area include a small footprint in both floor space and energy consumption, excellent data integrity over extended periods of time, virtually unlimited scalability, and a significantly lower cost per gigabyte than any other storage medium.
The most widely used tape format is Linear Tape Open (LTO), and that won’t change anytime soon. The LTO consortium guarantees that LTO tape drives and media work together. The eighth generation, LTO-8, was released in 2018 with a native (uncompressed) capacity of 12TB per cartridge. The ninth generation, LTO-9, is scheduled for release later in 2021 with an 18TB (uncompressed) capacity, a 50% increase over LTO-8. The LTO consortium publishes a comprehensive roadmap for future products, extending all the way to LTO-12 at 144TB per cartridge.
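As a quick check on the roadmap figures quoted above, a few lines of arithmetic show what they imply: LTO-9’s 18TB is a 50% step over LTO-8’s 12TB, and reaching LTO-12’s 144TB from there works out to roughly a doubling of native capacity per generation. The intermediate LTO-10 and LTO-11 capacities are not stated here; only the implied average growth rate is computed.

```python
# Implied growth from the native (uncompressed) capacities quoted above.
lto8, lto9, lto12 = 12, 18, 144   # TB per cartridge

step_8_to_9 = lto9 / lto8 - 1                 # 0.5 -> a 50% increase
per_gen_after_9 = (lto12 / lto9) ** (1 / 3)   # three generations: LTO-10, -11, -12

print(f"LTO-8 -> LTO-9: +{step_8_to_9:.0%}")                       # +50%
print(f"Implied growth per generation to LTO-12: x{per_gen_after_9:.1f}")  # x2.0
```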
The misconception that tape is difficult to manage is a long-standing problem. HSM (Hierarchical Storage Management) aimed to simplify the use of tape by giving applications a uniform network file interface and handling tape management itself. To make tape much easier to manage, an interface is needed that tolerates extended retrieval times and allows an unlimited number of data entities to be requested at once. There is now a de facto standard interface that, if adopted by tape system manufacturers, would allow a huge increase in the variety of uses for tape. All data on tape would be mapped as an offline tier and delivered to the application via an S3 interface. Because the application remains oblivious to the inner workings of the tape system, the tape system can provide more advanced capabilities such as multi-copy, remote tape management, and remastering. Many S3 applications could use tape with no changes if a tape system supported this interface. Future products will offer this capability; one is expected to debut in 2021.
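To make the S3-to-tape model concrete, here is a minimal sketch of how an application might work with objects held in an offline (tape-backed) tier through a standard S3 interface, using the common restore-then-read pattern. The endpoint, bucket, and key are placeholders, and whether a particular tape system exposes exactly these calls is an assumption, not a vendor specification.

```python
# Sketch: treating tape-resident objects as an "offline" S3 tier.
# The application requests a restore, polls until the object is staged,
# then reads it with an ordinary GET. Endpoint/bucket/key are placeholders.
import time
import boto3

s3 = boto3.client("s3", endpoint_url="https://tape-gateway.example.local")

bucket, key = "archive-bucket", "projects/2020/raw/frame_0001.dat"

# 1. Ask the tape system to stage the object back to online storage.
s3.restore_object(
    Bucket=bucket,
    Key=key,
    RestoreRequest={"Days": 7, "GlacierJobParameters": {"Tier": "Bulk"}},
)

# 2. Poll the object's Restore header until staging completes.
while True:
    head = s3.head_object(Bucket=bucket, Key=key)
    if 'ongoing-request="false"' in head.get("Restore", ""):
        break  # object is now readable
    time.sleep(60)

# 3. Read the staged object exactly like any other S3 object.
body = s3.get_object(Bucket=bucket, Key=key)["Body"].read()
```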
Can you tell us what technologies of the future have a shot of becoming mainstream?
The storage business is worth roughly $50 billion annually, making it an attractive target for new-technology venture capital. Many of these ventures have promised substantial improvements in one or more of the four primary characteristics of storage: price (per capacity), response time (latency), bandwidth (throughput), and durability. For the record, only a fraction of total VC investment over the past two decades has gone into developing low-level storage devices; the rest has gone toward storage systems that incorporate existing storage devices into their solutions. These ventures fit the venture capital market better because they are software-based and need less money to reach production. They have a shorter time to market and lower risk because they don’t require new discoveries in materials science, optics, or quantum physics.
The academic and government sectors, as well as the venture market, fund much of the foundational research and advanced development of game-changing storage technologies. For instance, it was recently announced that a single pane of glass or a single quartz crystal can store 360 terabytes of data indefinitely. Although holographic data storage has long been more hype than reality, researchers are still hard at work perfecting the technology. Another group is investigating data storage in DNA, and just recently a company was awarded $40 million for the concept of storing data by continuously bouncing it between satellites in low Earth orbit.
Quantum advances have allowed information to be stored by manipulating the “spin” of electrons. Despite their potential to revolutionize data storage, it’s hard to imagine any of these or similar projects maturing enough to meaningfully affect the digital universe before 2030 at the earliest. Several storage technologies have shown promise as prototypes in the past but failed to become commercially viable products that can compete with the cost, ruggedness, performance, and, most critically, reliability of the technologies already available. The rise of cloud service providers may make it easier for some of these technologies to reach customers.
What considerations lead businesses to choose cloud or on-premises storage, and what trends do you foresee in this area?
Cloud providers have been talking about the impending arrival of new hybrid systems (essentially hybrid perpetual storage) that will enable the use of cloud and/or on-premises processing while guaranteeing the long-term retention of both the raw data and the refined output of that processing, irrespective of where the processing takes place. The two tiers are the Project Tier and the Perpetual Tier. The Project Tier lives wherever the data is being actively processed, whether in the cloud or locally. New generations of storage solutions give businesses the freedom to decide whether the Perpetual Tier (containing inactive data) should be hosted in the cloud or on-premises, regardless of where the Project Tier resides.
The first thing a business must do when choosing where to locate the Project and Perpetual Tiers is to decide whether the processing will happen in the cloud or on-premises. Several considerations feed into this choice, including the total cost of ownership, the flexibility each option offers, and the company’s preference for capital versus operating expenditures. When deciding whether to implement a Perpetual Tier solution in the cloud or on-premises, businesses should consider the following questions (a rough cost sketch follows the list):
- How much information will be kept?
- For how long must the information be kept?
- Most importantly, how often and how much data would need to be restored?
- In what timeframe is data restoration required?
- How invested will my company be in a single cloud provider over the long haul?
- Do we have the infrastructure and manpower to manage an on-premises solution?
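To make those questions actionable, the following back-of-the-envelope sketch compares cloud and on-premises Perpetual Tier costs using the variables the questions identify: data volume, retention period, and restore volume. Every rate in it is an illustrative placeholder, not a quote from any provider or vendor.

```python
# Back-of-the-envelope Perpetual Tier cost comparison.
# Every rate below is a placeholder for illustration only.

def cloud_archive_cost(tb_stored, years, tb_restored_per_year,
                       storage_per_tb_month=1.0,   # $/TB-month, placeholder
                       egress_per_tb=90.0):         # $/TB restored, placeholder
    storage = tb_stored * 12 * years * storage_per_tb_month
    restores = tb_restored_per_year * years * egress_per_tb
    return storage + restores

def onprem_tape_cost(tb_stored, years,
                     library_capex=50_000.0,        # $, placeholder
                     media_per_tb=5.0,              # $/TB, placeholder
                     opex_per_year=10_000.0):       # power/space/admin, placeholder
    return library_capex + tb_stored * media_per_tb + opex_per_year * years

if __name__ == "__main__":
    tb, years, restored_per_year = 2_000, 10, 50    # answers to the questions above
    print(f"cloud:   ${cloud_archive_cost(tb, years, restored_per_year):,.0f}")
    print(f"on-prem: ${onprem_tape_cost(tb, years):,.0f}")
```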
Once the decision to process in the cloud, on-premises, or in a hybrid environment has been made, the location of the Perpetual Tier must be determined. For a cloud-based project to run, its data must reside in the cloud provider’s shared online storage.
Customers would benefit most from having the choice to run the Project Tier locally or in the cloud while keeping the Perpetual Tier on-premises. This calls for cutting-edge data storage technology. Imagine a future on-premises storage system that, upon receiving the raw data, would do two things rather than simply handing it off to the cloud: first, “sync” the data to the cloud so it can be processed there, and second, write an archive copy of the data to on-premises disk or tape. When processing is complete, either the customer or the system can delete the data from the cloud on a predetermined schedule.
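Here is a minimal sketch of that hybrid flow, assuming an S3-compatible cloud bucket for the Project Tier and a local path standing in for the on-premises Perpetual Tier; the bucket name, paths, and function names are placeholders rather than features of any particular product.

```python
# Sketch of the hybrid "sync to cloud + archive on-premises" flow described above.
# Bucket, prefix, and archive paths are placeholders.
import shutil
from pathlib import Path
import boto3

s3 = boto3.client("s3")
BUCKET = "project-tier-bucket"              # cloud Project Tier (placeholder)
ARCHIVE_ROOT = Path("/archive/perpetual")   # on-prem Perpetual Tier (placeholder)

def ingest(raw_file: Path, project: str) -> str:
    """Sync a raw file to the cloud for processing AND keep an on-prem archive copy."""
    key = f"{project}/raw/{raw_file.name}"
    s3.upload_file(str(raw_file), BUCKET, key)      # 1. make it available to cloud compute
    dest = ARCHIVE_ROOT / project / raw_file.name
    dest.parent.mkdir(parents=True, exist_ok=True)
    shutil.copy2(raw_file, dest)                     # 2. archive copy on the on-prem tier
    return key

def expire_cloud_copy(project: str) -> None:
    """After processing finishes, remove the cloud copy; the on-prem archive remains."""
    paginator = s3.get_paginator("list_objects_v2")
    for page in paginator.paginate(Bucket=BUCKET, Prefix=f"{project}/raw/"):
        for obj in page.get("Contents", []):
            s3.delete_object(Bucket=BUCKET, Key=obj["Key"])
```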