Open-source Software (OSS)

Open-source software (OSS) is a type of computer software in which source code is released under a license where the copyright holder grants users the rights to use, study, change, and distribute the software as they choose. Originating in the context of software development, the term open-source describes something people can modify and share because its design is publicly accessible. Nowadays, “open-source” indicates a wider set of values known as “the open-source way.” Open-source projects and initiatives support and observe standards of open exchange, mutual contribution, transparency, and community-oriented development.

What is the source code of OSS?

The source code of open-source software is the part of the software that most users never see. Source code is the code that computer programmers can modify to change how the software works. Programmers with access to the source code can improve a program by adding features to it or by fixing bugs that prevent the software from working correctly.
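As a toy illustration (the function below is hypothetical, not from any real project), here is the kind of change anyone with source access can make:

    # greet.py: a trivial "program" whose source is open for anyone to read.
    def greet(name):
        return "Hello, " + name

    # A programmer with source access can add a feature or fix a bug directly,
    # for example handling empty input, then redistribute the improved version:
    def greet_v2(name):
        if not name:
            return "Hello, world"
        return "Hello, " + name

    print(greet_v2("Ada"))  # Hello, Ada
    print(greet_v2(""))     # Hello, world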

If you’re going to use OSS, you may want to consider also using a VPN. Here are our top picks for VPNs in 2021.

Examples of Open-source Software

For software to be considered open-source, its source code must be freely available to its users. This gives users the ability to modify the software and distribute their own versions of the program. Users also have the power to give out as many copies of the original program as they want, and anyone can use the program for any purpose; there are no licensing fees or other restrictions on the software.

Linux is a great example of an open-source operating system. Anyone can download Linux, create as many copies as they want, and offer them to friends, and Linux can be installed on any number of computers. Users with more knowledge of program development can download the source code for Linux and modify it, creating their own customized version of the program.

Below is a list of the top 10 open-source software programs available in 2021.

  1. LibreOffice
  2. VLC Media Player
  3. GIMP
  4. Shotcut
  5. Brave
  6. Audacity
  7. KeePass
  8. Thunderbird
  9. FileZilla
  10. Linux

Setting up Linux on a server? Find the best server for your needs with our top 5.

Advantages and Disadvantages of Open-source Software

Similar to any other software on the market, open-source software has its pros and cons. Open-source software is typically easier to get than proprietary software, resulting in increased use. It has also helped to build developer loyalty as developers feel empowered and have a sense of ownership of the end product. 

Open-source software is usually more flexible, quicker to innovate, and more reliable, thanks to the thousands of independent programmers testing and fixing its bugs around the clock. It is said to be more flexible because modular systems allow programmers to build custom interfaces or add new capabilities. The quicker innovation of open-source programs is the result of teamwork among a large number of different programmers. Furthermore, open-source software is not reliant on the company or author that originally created it; even if the company fails, the code continues to exist and be developed by its users.

Open-source software also requires lower spending on marketing and logistical services, and it can be a great tool to boost a company’s image, including its commercial products. The OSS development approach has helped produce reliable, high-quality software quickly and inexpensively. A 2008 report by the Standish Group stated that the adoption of open-source software models has resulted in savings of about $60 billion per year for consumers.

On the flip side, an open-source development process may lack the well-defined stages that are usually needed, such as system testing and documentation, both of which may be ignored. Skipping these stages has mainly been true of small projects; larger projects tend to define and impose at least some of these stages, as teamwork makes them a necessity.

Not all OSS projects have been successful, either; SourceXchange and Eazel, for example, both failed miserably. It is also difficult to build a financially strong business model around the open-source concept: a project may satisfy its technical requirements but not the ones needed for market profitability. Regarding security, open source may allow hackers to learn about the weaknesses or gaps of the software more easily than closed-source software does.

Benefits for Users of OSS

The most obvious benefit of open-source software is that it can be used for free. Take the example of Linux above: unlike Windows, users can install or distribute as many copies of Linux as they want, without limitations. Installing Linux for free can be especially useful for servers. If a user wants to set up a virtualized cluster of servers, they can easily duplicate a single Linux server without worrying about licensing or how many instances of Linux they’re authorized to run.

An open-source program is also more flexible, allowing users to modify their own version into an interface that works for them. When a Linux desktop introduces a new desktop interface that some users aren’t fans of, they can modify it to their liking. Open-source software also allows developers to “be their own creator” and design their own software. Android and Chrome OS are operating systems built on Linux and other open-source software, and the core of Apple’s OS X was built on open-source code, too. When users can manipulate the source code and develop software tailored to their needs, the possibilities are truly endless.

The Best Way to Prepare for a Data Center Take Out and Decommissioning

Whether your organization plans on relocating, upgrading, or migrating to the cloud, data center take outs and decommissioning are no easy feat. There are countless ways that something could go wrong if you attempt such a daunting task on your own. Partnering with an IT equipment specialist that knows the ins and outs of data center infrastructure is the best way to go. Since 1965, our highly experienced team of equipment experts, project managers, IT asset professionals, and support staff has handled numerous successful data center projects in every major US market. From a single server rack to a warehouse-sized data center holding thousands of IT assets, we have the technical and logistical capabilities to handle your take out and decommissioning needs. Regardless of the requirements you’re facing, we can design a complete end-to-end solution to fit your specific needs.

 

Learn more about the data center services we offer

 

But that’s enough about us. We wrote this article to help YOU. Below is a step-by-step guide on how to prepare your data center for a complete removal, or simply to retire the assets it holds. As always, we are here to help every step of the way.

Make a Plan

Create a list of goals you wish to achieve with your take out or decommissioning project, and outline expected outcomes or milestones with expected completion times. These will keep you on task and on course. Appoint a project manager to oversee the project from start to finish. Most importantly, ensure backup systems are working correctly so that no data is lost along the way.

 

Make a List

Be sure to make an itemized list of all hardware and software that will be involved in the decommissioning project or data center take out. Make sure nothing is disregarded, and check twice with a physical review. Once all of the equipment in your data center is itemized, build a complete inventory of assets, including hardware items such as servers, racks, networking gear, firewalls, storage, routers, switches, and even HVAC equipment. Collect all software licenses and virtualization hardware involved, and keep every software license associated with servers and networking equipment.
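As a rough sketch of what such an itemized inventory might look like in practice, here is a minimal Python example; the asset fields, tags, and CSV layout are illustrative assumptions rather than any industry standard:

    import csv
    from dataclasses import dataclass, asdict

    @dataclass
    class Asset:
        asset_tag: str      # internal tracking number
        category: str       # server, rack, switch, router, storage, HVAC, ...
        model: str
        serial_number: str
        location: str       # room / row / rack position
        license_info: str   # any software license tied to the hardware

    # Build the inventory as assets are verified during the physical review.
    inventory = [
        Asset("DC-0001", "server", "Dell R740", "SN123", "Row 3 / Rack 7 / U12", "Win Server 2019"),
        Asset("DC-0002", "switch", "Cisco 9300", "SN456", "Row 3 / Rack 7 / U1", "IOS-XE"),
    ]

    # Persist to CSV so the list can be double-checked against the floor.
    with open("decommission_inventory.csv", "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=asdict(inventory[0]).keys())
        writer.writeheader()
        writer.writerows(asdict(a) for a in inventory)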

 

Partner with an ITAD Vendor

Partnering with an experienced IT Asset Disposition (ITAD) vendor can save you a tremendous amount of time and stress. An ITAD vendor can help with an implementation plan listing the roles, responsibilities, and activities to be performed within the project. Along with the steps mentioned above, they can assist in preparing tracking numbers for each asset earmarked for decommissioning and cancel maintenance contracts for equipment slated for retirement.

Learn more about our ITAD process

 

Get the Required Tools

Before you purchase or rent any tools or heavy machinery, it is best to make a list of the tools, materials, and labor hours you will need to complete this massive undertaking. Some examples of tools and materials that might be necessary include forklifts, hoists, device shredders, degaussers, pallets, packing foam, hand tools, labels, boxes, and crates. Calculate the number of man hours needed to get the job done. Try to be as specific as possible about what the job requires at each stage. If outside resources are needed, make sure to perform the necessary background and security checks ahead of time. After all, it is your data at stake here.
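One hedged way to rough out that labor estimate is a simple per-rack calculation; the stage names and per-rack hour figures below are placeholder assumptions you would replace with your own numbers:

    # Hypothetical per-rack labor estimate for a take out, stage by stage.
    HOURS_PER_RACK = {
        "inventory_and_tagging": 1.5,
        "data_erasure": 4.0,
        "de_racking_and_packing": 3.0,
        "loading_and_logistics": 1.0,
    }

    def estimate_man_hours(rack_count: int) -> float:
        """Total labor hours for a project of rack_count racks."""
        return rack_count * sum(HOURS_PER_RACK.values())

    racks = 40
    print(f"{racks} racks -> ~{estimate_man_hours(racks):.0f} man hours")
    # 40 racks -> ~380 man hours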

 

Always Think Data Security

When the time comes to start the data center decommissioning or take out project, review your equipment checklist and verify all of your data has been backed up before powering down and disconnecting any equipment. Be sure to tag and map cables for easier setup and transport, record serial numbers, and tag all hardware assets. For any equipment that will be transported off-site and no longer used, data erasure may be necessary. When transporting data off-site, make sure a logistics plan is in place. A certified and experienced ITAD partner will typically offer certificates of data destruction and chain of custody throughout the entire process. They may also advise you on erasing, degaussing, shredding, or preparing each itemized piece of equipment for recycling.
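Below is a minimal sketch of the kind of per-asset record a chain-of-custody log might track, assuming hypothetical field names and statuses (a real ITAD partner will have its own system):

    import hashlib, json
    from datetime import datetime, timezone

    def custody_event(asset_tag: str, serial: str, action: str, handler: str) -> dict:
        """One tamper-evident chain-of-custody entry for a decommissioned asset."""
        event = {
            "asset_tag": asset_tag,
            "serial": serial,
            "action": action,      # e.g. backup_verified, powered_down, erased, shredded
            "handler": handler,
            "timestamp": datetime.now(timezone.utc).isoformat(),
        }
        # Hash the entry so later tampering with the log is detectable.
        event["sha256"] = hashlib.sha256(
            json.dumps(event, sort_keys=True).encode()).hexdigest()
        return event

    log = [
        custody_event("DC-0001", "SN123", "backup_verified", "jsmith"),
        custody_event("DC-0001", "SN123", "erased", "itad_vendor"),
    ]
    print(json.dumps(log, indent=2))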

Learn more about the importance of data security

 

Post Take Out and Decommission

Once the data center take out and decommission project is complete, the packing can start. Make sure you have a dedicated space for packing assets. If any equipment is allocated for reuse within the company, follow the appropriate handoff procedure. For assets intended for refurbishing or recycling, pack and label for the intended recipients. If not using an ITAD vendor, be sure to use IT asset management software to track all stages of the process.

Microsoft’s Project Natick: The Underwater Data Center of the Future

When you think of underwater, deep-sea adventures, what comes to mind? Colorful plants, odd-looking sea creatures, and maybe even a shipwreck or two; but what about a data center? Moving forward, underwater data centers may become the norm rather than an anomaly. Back in 2018, Microsoft sank an entire data center to the bottom of the Scottish sea, submerging 864 servers and 27.6 petabytes of storage. After two years of sitting 117 feet deep in the ocean, Microsoft’s Project Natick, as it’s known, has been brought to the surface and deemed a success.

What is Project Natick?

 

Microsoft’s Project Natick was conceived back in 2015, around the idea that submerged servers could have a significant impact on lowering energy usage. To test the original hypothesis, Microsoft immersed a data center off the coast of California for several months as a proof of concept, to see whether the computers would even endure the underwater journey. Ultimately, the experiment was meant to show that portable, flexible data center placements in coastal areas around the world could scale up data center capacity while keeping energy and operating costs low. Doing so would allow companies to utilize smaller data centers closer to where customers need them, instead of routing everything to centralized hubs. Next, the company will look into increasing the size and performance of these data centers by connecting more than one together to merge their resources.

What We Learned from Microsoft’s Undersea Experiment

After two years of submersion, the results of the experiment showed not only that offshore underwater data centers appear to work well in terms of overall performance, but also that the servers contained within the data center proved to be up to eight times more reliable than their above-ground equivalents. The team of researchers plans to examine exactly what was responsible for this greater reliability; for now, steady temperatures, no oxygen corrosion, and a lack of humans bumping into the computers are thought to be the reasons. Hopefully, the same outcome can be translated to land-based server farms for increased performance and efficiency across the board.

The project also demonstrated that such data centers can operate with more power efficiency, especially in regions where the grid on land is not considered reliable enough for sustained operation. Microsoft will also take lessons on renewability from the project’s successful deployment, with Natick relying on wind, solar, and experimental tidal technologies. As for future underwater servers, Microsoft acknowledged that the project is still in its infancy; if it were to build a data center with the same capabilities as a standard Microsoft Azure data center, it would require multiple vessels.

Do your data centers need servicing?

The Benefits of Submersible Data Centers

 

Using the sea as a natural cooling agent instead of consuming energy to cool a data center is an obvious positive outcome of the experiment. When Microsoft hauled its underwater data center up from the bottom of the North Sea and conducted some analysis, researchers also found the servers were eight times more reliable than those on land.

The shipping-container-sized pod that was recently pulled from 117 feet below the North Sea off Scotland’s Orkney Islands was deployed in June 2018. Over the last two years, researchers observed the performance of 864 standard Microsoft data center servers installed on 12 racks inside the pod. During the experiment they also learned more about the economics of modular undersea data centers, which can be quickly set up offshore near population centers and need fewer resources for efficient operation and cooling.

Natick researchers believe the servers benefited from the pod’s nitrogen atmosphere, which is less corrosive than oxygen. The absence of human interaction to disrupt components also likely added to the increased reliability.

The North Sea-based project also exhibited the possibility of leveraging green technologies for data center operations. The data center was connected to the local electric grid, which is 100% supplied by wind, solar and experimental energy technologies. In the future, Microsoft plans to explore eliminating the grid connection altogether by co-locating a data center with an ocean-based green power system, such as offshore wind or tidal turbines.

Snowflake IPO

On September 16, 2020, history was made on the New York Stock Exchange. A software company named Snowflake (ticker: SNOW) completed the largest software IPO ever. One of the most hotly anticipated listings of 2020, Snowflake began publicly trading at $120 per share and jumped to $300 per share within a matter of minutes. With that unprecedented hike in price, Snowflake also became the largest company ever to double in value on its first day of trading, ending with a value of almost $75 billion.

What is Snowflake?

So, what exactly does Snowflake do? What is it that makes billionaire investors like Warren Buffett and Marc Benioff jump all over a newly traded software company? It must be something special, right? With all the speculation surrounding the IPO, it’s worth explaining what the company does. A simple explanation is that Snowflake helps companies store their data in the cloud rather than in on-site facilities. Traditionally, a company’s data has been stored on-premises, on physical servers managed by that company, and tech giants like Oracle and IBM have led that industry for decades. Snowflake is profoundly different: instead of helping companies store their data on-premises, Snowflake facilitates the warehousing of data in the cloud. But that’s not all. Snowflake also makes the data queryable, meaning it simplifies the process for businesses looking to pull insights from the stored data. This is what sets Snowflake apart from the other data-hoarding behemoths of the IT world. Snowflake discovered the secret to separating data storage from the act of computing on the data, and the best part is that it did so before any of the other big players like Google, Amazon, or Microsoft. Snowflake is here to stay.

Snowflake’s Leadership

Unlike Silicon Valley’s tech unicorns of the past, Snowflake was started in 2012 by three database engineers. Backed by venture capitalists and one VC firm that wishes to remain anonymous, Snowflake is currently led by software veteran Frank Slootman. Before taking the reins at Snowflake, Slootman had great success leading Data Domain and ServiceNow. He grew Data Domain from a twenty-employee startup to over $1 billion in sales and a $2.4 billion acquisition by EMC. It’s safe to say that Snowflake is in the right hands, especially if it has any hopes of growing into its valuation.

Snowflake’s Product Offering

We all know that Snowflake isn’t the only managed data warehouse in the industry; both Amazon Web Services’ (AWS) Redshift and Google Cloud Platform’s (GCP) BigQuery are very common alternatives. So there had to be something that set Snowflake apart from the competition: a combination of flexibility, service, and user interface. With a database like Snowflake, two pieces of infrastructure drive the revenue model: storage and compute. Snowflake takes responsibility for storing the data as well as ensuring that queries run fast and smoothly. The idea of splitting storage and compute in a data warehouse was unusual when Snowflake launched in 2012; today, there are query engines like Presto that exist solely to run queries, with no storage included. Snowflake offers the advantages of splitting storage and queries: stored data is located remotely in the cloud, freeing local resources for the computing load. Moving storage to the cloud delivers lower cost, higher availability, and greater scalability.

 

Multiple Vendor Options

A majority of companies have adopted a multi-cloud strategy, as they prefer not to be tied down to a single cloud provider. There’s a natural hesitancy to choose options like BigQuery that are tied to a single cloud like Google. Snowflake offers a different kind of flexibility, operating on AWS, Azure, or GCP and satisfying the multi-cloud wishes of CIOs. With tech giants battling for domination of the cloud, Snowflake is, in a sense, the Switzerland of data warehousing.

Learn more about a multi-cloud approach


Snowflake as a Service

When considering building a data warehouse, you need to take into account the management of the infrastructure itself. Even when farming out servers to a cloud provider, decisions like the right storage size, scaling for growth, and networking hardware come into play. Snowflake is a fully managed service, which means users don’t need to build any infrastructure at all: just put your data into the system and query it. Simple as that.
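Using the real snowflake-connector-python package, “put your data into the system and query it” looks roughly like the sketch below. The account identifier, credentials, warehouse, and table names are placeholders, not real values.

    # pip install snowflake-connector-python
    import snowflake.connector

    # Connect to the fully managed service: no servers to size, patch, or scale.
    conn = snowflake.connector.connect(
        account="my_org-my_account",  # placeholder account identifier
        user="ANALYST",               # placeholder credentials
        password="...",
        warehouse="COMPUTE_WH",       # compute, billed separately from storage
        database="DEMO_DB",
        schema="PUBLIC",
    )

    cur = conn.cursor()
    cur.execute("SELECT region, SUM(amount) FROM sales GROUP BY region")
    for region, total in cur.fetchall():
        print(region, total)
    conn.close()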

While a fully managed service sounds great, it comes at a cost. Snowflake users need to be deliberate about storing and querying their data, as fully managed services are pricey. If you’re deciding whether to build or buy your data warehouse, it would be wise to compare the total cost of Snowflake ownership to the cost of building something yourself.

 

Snowflake’s User Interface and SQL Functionality

Snowflake’s UI for querying and exploring tables is as easy on the eyes as it is to use. Its SQL functionality is also a strong selling point. SQL (Structured Query Language) is the programming language that developers and data scientists use to query their databases, and each database has slightly different details, wording, and structure. Snowflake’s SQL seems to have collected the best from all of the database languages and added other useful functions.
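As a taste of those conveniences, here are two genuine Snowflake SQL features, shown as query strings; the tables and columns are hypothetical:

    # 1) Path syntax into a semi-structured VARIANT column: query raw JSON
    #    without defining a schema up front.
    q_json = """
        SELECT payload:user.city::string AS city, COUNT(*) AS events
        FROM   raw_events
        GROUP  BY city
    """

    # 2) QUALIFY filters on window functions directly, avoiding the wrapping
    #    subquery most other SQL dialects would require.
    q_latest = """
        SELECT *
        FROM   orders
        QUALIFY ROW_NUMBER() OVER (PARTITION BY customer_id ORDER BY order_ts DESC) = 1
    """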

 

A Battle Among Tech Giants

As the saying goes, competition creates reason for caution. Snowflake is rubbing shoulders with some of the world’s largest companies, including Amazon, Google, and Microsoft. While Snowflake has benefited from an innovator’s market advantage, the Big Three are catching up quickly by building comparable platforms.

However, Snowflake is dependent on these competitors for data storage. It has managed to thrive by acting as “Switzerland,” so customers don’t have to commit to just one cloud provider. As more competition enters the multi-cloud service industry, nonalignment can be an advantage, but it may not always be possible. Snowflake’s market share is vulnerable, as there are no clear barriers to entry for the industry giants, given their technical talent and size.

Snowflake is just an infant in the public eye, and we will see whether it sinks or swims over the next year or so. But with brilliant leadership, a promising market, and an extraordinary track record, Snowflake may be much more than a one-hit wonder; it may be a once-in-a-lifetime business.

DTC – A True Partnership

For Over Half a Century We’ve Been Committed to Serving IT Departments and Saving IT Budgets

 

Our Story

In 1965, we opened our doors for business with the idea of transforming the IT equipment industry through technology, transparency, standards, and processes. We planted our roots as a round reel tape company in Downey, CA. As a family-owned and operated business over the past 50 years, we have sprouted into one of the most trustworthy, reliable, and authoritative organizations in the industry.

From disk pack tape storage and round reel tape to hard drives, networked storage, tape libraries, and cloud backup systems, our business and partnerships continue to prosper and grow with the constantly innovating IT industry. DTC proudly works with all organizations, letting our reputation speak for itself.

DTC’s 3-Point Message is Simple:

 

  • Our goal is to reach 100% Recyclability of old storage media and IT assets.

 

Electronics recycling is our bread and butter. We’ve been saving both the environment and companies’ money by setting the standard for secure handling and repurposing of used and obsolete electronics. Recycling of electronics and IT equipment is an essential part of a company’s waste management strategy. If you are looking for a safe and secure way to recycle electronics, consider our proven services. We specialize in the ethical disposal and reprocessing of used and obsolete electronics and computer equipment, and we can help you accomplish your legal and environmental goals as a responsible organization. Let us be the solution to your problem and help your organization stay socially responsible.

 

Learn more about recycling your old IT assets

 

  • Our pledge since day one has been to keep your data safe.

 

Data security is a main concern for IT departments in any organization, and rightly so. Many of our partners demand that their data be handled appropriately and destroyed according to both government and industry standards. DTC provides honest and secure data destruction services, including physical destruction with a mobile shredder and secure data erasure methods like degaussing. All of our destruction services are effective, auditable, and certified. Ship storage assets to our secured facility, or simply ask for the mobile data destroyer to be deployed on site. With over 50 years of service, we’ve never had a single data leak. Now that’s experience you can trust!

Learn more about DTC data security

 

  • Our process will help you save time and money.

 

Our IT asset disposition (ITAD) process will help your organization recoup dollars from your surplus, used IT assets and free up storage space at your facility. Our equipment buyback program is dedicated to purchasing all types of surplus and used data storage and IT equipment, and we use the highest standards to ensure you get the greatest return on your initial IT investment. With the current pace of hardware evolution, most companies upgrade their systems every two years, which leads to a lot of surplus IT equipment. DTC has the experience and resources to get you the most for your old IT assets.

Get the most return on your IT investment 

The Value We Provide

DTC’s diverse knowledge base and experience allow our partners to use our purchasing and sales personnel as a valued resource for questions, research, and answers. Our vast database and contact list of customers, resellers, recyclers, suppliers, and industry partners allow us to secure excellent pricing when sourcing your IT equipment. Don’t believe us? Let us know what you need, and we will find it for you.

How Can We Help You?

Here is a brief list of services we provide:

 

Ready to work with a trusted partner? Contact Us Today



The TikTok Controversy: How Much Does Big Tech Care About Your Data and Its Privacy?

If you have a teenager in your house, you’ve probably encountered them making weird dance videos in front of their phone’s camera. Welcome to the TikTok movement that’s taking over our nation’s youth. TikTok is a popular social media video-sharing app that continues to make headlines due to cybersecurity concerns. Recently, the U.S. military banned its use on government phones following a warning from the DoD about potential personal information risk, and TikTok has since confirmed that it patched multiple vulnerabilities that exposed user data. To better understand TikTok’s true impact on data and data privacy, we’ve compiled some of the details regarding the information TikTok gathers, sends, and stores.

What is TikTok?

TikTok is a video-sharing application that allows users to create short, fifteen-second videos on their phones and post the content to a public platform. Videos can be enriched with music and visual elements, such as filters and stickers. The app’s young, adolescent demographic, along with the content that is created and shared on the platform, has put its privacy features in the limelight as of late. Even more so, questions about where TikTok data is stored and who can access it have raised red flags.

You can review TikTok’s privacy statement for yourself here.

TikTok Security Concerns

Even though TikTok allows users to control who can see their content, the app does ask for a number of consents on your device; most notably, it accesses your location and device information. While there’s no evidence of malicious activity or of TikTok violating its privacy policy, it is still advisable to practice caution with the content that’s both created and posted.

The biggest concern surrounding the TikTok application is where user information is stored and who has access to it. According to the TikTok website, “We store all US user data in the United States, with backup redundancy in Singapore. Our data centers are located entirely outside of China, and none of our data is subject to Chinese law.” It also states, “The personal data that we collect from you will be transferred to, and stored at, a destination outside of the European Economic Area (“EEA”).” There is no other specific information regarding where user data is stored.

Recently, TikTok published a Transparency Report listing “legal requests for user information,” “government requests for content removal,” and “copyrighted content take-down notices.” The “Legal Requests for User Information” section shows that India, the United States, and Japan are the top three countries where user information was requested. The United States was the number one country in both fulfilled requests (86%) and the number of accounts specified in the requests (255). Oddly enough, China is not listed as having made any requests for user information.

What Kind of Data is TikTok Tracking?

Below are some of the consents TikTok requires on Android and iOS devices after installation of the app. While some of the permissions are to be expected, all of them are consistent with TikTok’s written privacy policy. Even so, viewing everything TikTok gathers from its users can be alarming. In short, the app allows TikTok to:

  • Access the camera (and take pictures/video), the microphone (and record sound), the device’s WIFI connection, and the full list of contacts on your device.
  • Determine if the internet is available and access it if it is.
  • Keep the device turned on and automatically start itself.
  • Secure detailed information on the user’s location using GPS.
  • Read and write to the device’s storage, install/remove shortcuts, and access the flashlight (turn it off and on).

You read that right: TikTok has full access to your audio, video, and the list of contacts on your phone. The geolocation tracking via GPS is somewhat surprising, though, especially since TikTok videos don’t display location information. So why collect that information? If you operate an Android device, TikTok can also detect other apps running at the same time, which could give it access to data in another app, such as a banking or password storage app.
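If you want to audit an Android app’s requested permissions yourself, one approach is to parse the AndroidManifest.xml extracted from its APK; the sketch below uses a toy inline manifest for illustration rather than TikTok’s actual manifest:

    import xml.etree.ElementTree as ET

    # A toy manifest excerpt; a real one is extracted from the APK (e.g., with apktool).
    MANIFEST = """
    <manifest xmlns:android="http://schemas.android.com/apk/res/android">
      <uses-permission android:name="android.permission.CAMERA"/>
      <uses-permission android:name="android.permission.RECORD_AUDIO"/>
      <uses-permission android:name="android.permission.READ_CONTACTS"/>
      <uses-permission android:name="android.permission.ACCESS_FINE_LOCATION"/>
    </manifest>
    """

    # ElementTree stores namespaced attributes under the full namespace URI.
    ANDROID_NS = "{http://schemas.android.com/apk/res/android}"
    root = ET.fromstring(MANIFEST)
    for perm in root.iter("uses-permission"):
        print(perm.attrib[f"{ANDROID_NS}name"])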

Why is TikTok Banned by the US Military?

In December 2019, the US military started instructing soldiers to stop using TikTok on all government-owned phones. This TikTok policy reversal came shortly after the release of a Dec. 16 Defense Department Cyber Awareness Message classifying TikTok as having potential security risks associated with its use. As the US military cannot prevent government personnel from accessing TikTok on their personal phones, leaders recommended that service members use caution if unfamiliar text messages are received.

In fact, this was not the first time that the Defense Department had been required to encourage service members to remove a popular app from their phones. In 2016, the Defense Department banned the augmented-reality game, Pokémon Go, from US military owned smartphones. However, this case was a bit different as military officials alluded to concerns over productivity and the potential distractions it could cause. The concerns over TikTok are focused on cybersecurity and spying by the Chinese government.

In the past, the DoD has put out more general social media guidelines, advising personnel to proceed with caution when using any social platform. And all DoD personnel are required to take annual cyber awareness training that covers the threats that social media can pose.

LTO-9 Tape Technology (Pre-Purchase Program)

Our LTO-9 Pre-Purchase Program allows anyone to pre-order LTO-9 tape technology before it is available. This is the ninth generation of tape technology, delivering on the promise made by the LTO Consortium to develop LTO tape technology through at least 12 generations. In an endeavor to deliver our customers the latest technology on the market, we are offering pre-orders of LTO-9 tape technology. This gives our customers the best opportunity to receive the latest generation of LTO tape as soon as it’s available. LTO-9 is expected to be available in Fall 2020.

How to Buy: CLICK HERE, or call us today @ 1-800-700-7683.

How to Sell: For those looking to sell old data tapes prior to upgrading to LTO-9, CLICK HERE to submit your inventory and we will contact you within 24 hours.


LTO TECHNOLOGY FOR LONG-TERM DATA PROTECTION

LTO tape technology provides organizations with reliable, long-term data protection and preservation. With LTO tape drives, organizations can meet security and compliance requirements while at the same time saving on storage footprint, power, and cooling costs, which can make a significant difference in operating costs for larger library environments.

LTO-9 FEATURED HIGHLIGHTS

  • Lowest cost per GB.

  • Tape offers lower power and cooling costs, plus a lower footprint leads to improved TCO.

  • Linear Tape File System (LTFS) support.

  • AES 256-bit Encryption – Military-grade encryption comes standard.

  • WORM technology – Makes data non-rewriteable and non-erasable, which acts as an immutable vault within your tape library to secure and protect an offline copy from ransomware.

LTO-9 vs. LTO-8

LTO-9 (Linear Tape-Open 9) is the latest tape format announced by the Linear Tape-Open Consortium, following the LTO-8 format, which launched in 2017. LTO-9 is expected to double the capacity of LTO-8 to 60 TB compressed. LTO-8 provides 30 TB of compressed storage capacity and 12 TB of uncompressed capacity, doubling what LTO-7 offered.

Although the LTO Consortium has not yet announced the data transfer rate for LTO-9, LTO-8 features an uncompressed data transfer rate of up to 360 MBps and a compressed data transfer rate of up to 750 MBps.
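To put those figures in perspective, a quick back-of-the-envelope calculation shows how long it would take to fill a cartridge at the published LTO-8 rates (LTO-9 rates being unannounced, per the above):

    # Time to fill an LTO-8 cartridge at its rated transfer speeds.
    def hours_to_fill(capacity_tb: float, rate_mbps: float) -> float:
        return (capacity_tb * 1e12) / (rate_mbps * 1e6) / 3600

    print(f"12 TB native @ 360 MB/s: {hours_to_fill(12, 360):.1f} h")      # ~9.3 h
    print(f"30 TB compressed @ 750 MB/s: {hours_to_fill(30, 750):.1f} h")  # ~11.1 h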

LTO-9 has a similar structure to LTO-8 in that its drives are backward-compatible by one generation, just as LTO-8 drives can read and write LTO-7 tapes. LTO drives had typically been able to read back two generations and write back one; however, as of LTO-8, backward reading compatibility is limited to one generation.
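Those compatibility rules can be captured in a small lookup function; this is just an illustration of the rules as described above, not vendor documentation:

    # Backward compatibility by drive generation, per the rules described above:
    # through LTO-7, drives read back two generations and write back one;
    # from LTO-8 on, they read and write back one generation only.
    def compatible_generations(drive_gen: int) -> dict:
        if drive_gen >= 8:
            reads = {drive_gen, drive_gen - 1}
        else:
            reads = {drive_gen, drive_gen - 1, drive_gen - 2}
        writes = {drive_gen, drive_gen - 1}
        return {"reads": reads, "writes": writes}

    print(compatible_generations(8))  # reads {7, 8}, writes {7, 8}
    print(compatible_generations(9))  # reads {8, 9}, writes {8, 9}
    print(compatible_generations(7))  # reads {5, 6, 7}, writes {6, 7}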

LTO-9 also features the same WORM, LTFS, and 256-bit encryption technology as the prior generation, LTO-8.

Uses for LTO-9

LTO features high capacity, durability, and portability at a comparatively low cost. Archived data is not normally needed on an immediate basis, making tape a solid option for archives; more commonly, backup data is used for restores in the event of an incident or data loss.

LTO-9 tapes housed at an off-site location are a fantastic option for disaster recovery. If an organization’s main data hub has an incident, it can use the durable LTO-9 tapes to recover its data. According to the LTO Consortium, once data becomes less frequently retrieved, it should be migrated to tape.

Tape is particularly useful in industries such as entertainment and healthcare that generate large volumes of data every day and require a long-term data storage option that’s less expensive than disk. As ransomware attacks stay in the headlines, tape provides an offline backup storage option immune to cyberattack: data stored on an LTO-9 tape cartridge does not have to be connected to the network, creating what is called an air gap and a safety net against cyberattacks.

Pros and Cons of LTO-9 Tape

Tape capacity continues to expand. When LTO-9 launches, it will have grown the compressed capacity of LTO tape products by almost 60 TB in roughly 10 years. As data levels continue to grow rapidly for many organizations, capacity is one of the most important aspects of data storage media. The cost of tape is also low compared to storing 60 TB on other storage media such as disk or flash, particularly when taking energy and equipment into consideration, since a constant energy source is not required to keep data stored on tape.

Other advantages of LTO-9 tape include:

  • A reliable generational roadmap that allows customers to count on a new product every few years, and a capacity that is not far off from the original estimate.

  • 256-bit encryption that guarantees security during storage and shipment. Its offline nature also serves as protection from ransomware and cyberattacks, creating an airgap.

  • A reputation for being extremely reliable, with a lifespan of roughly 30 years. The tape format is also portable, making it remarkably easy to transport.

LTO’s open format also allows customers to access multiple, compatible products. The open format offers intellectual property licenses to prospective manufacturers, leading to innovation and improvements. However, LTO products are not compatible with non-LTO products.

Depending on the amount of data you need to store, cloud storage can be less expensive than tape, and in some instances cloud backup providers offer a free tier up to a specified volume of data. Cloud also offers random access, unlike tape. But restoration of data files can be slow, depending on data volume and bandwidth.
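That “depending on data volume and bandwidth” caveat is easy to quantify; the data size and link speeds below are illustrative assumptions:

    # Rough restore time from cloud backup at various link speeds.
    def restore_hours(data_tb: float, link_gbps: float) -> float:
        return (data_tb * 8e12) / (link_gbps * 1e9) / 3600

    for gbps in (0.1, 1.0, 10.0):
        print(f"12 TB over {gbps:>4} Gbps: ~{restore_hours(12, gbps):,.0f} h")
    # 12 TB over 0.1 Gbps: ~267 h; over 1 Gbps: ~27 h; over 10 Gbps: ~3 h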

DTC – Clients

DTC works with some of the biggest names in #business! We’re here to help. Give our sales team a call today and get your #data on the right track! P: 1-800-700-7683

 

#Fortune500 #Software #Sports #Food #Beverage #Hospitality #Entertainment #Healthcare #Retail #Education #Energy #Development

Using IT to Help First Responders Save Lives

Imagine sitting in rush hour traffic on a Friday afternoon when you see an ambulance approaching in your rear-view mirror with its lights flashing. Surely you assume there must be an accident ahead, but what if it were a relative on their way to the hospital?

The question you ask yourself is, “how is there not a better way?” With all of the emerging technology these days, there certainly has to be something to help those who need it most.

Lo and behold: smart cities. Smart cities are the trend of the future, and the technologies that empower them are likely to become a $135 billion market by 2021.

For first responders, the prospect of smart traffic lights is a welcome change. By working with GPS technology in emergency response vehicles, smart traffic lights can help first responders avoid traffic jams and significantly reduce response times.

Even better, sensors that check the structural integrity of buildings, bridges, and roads can increase safety by identifying problems before they cause an accident. Such preventative maintenance can help cities avoid the costs associated with everything from minor injuries to major and fatal accidents.

What could go wrong?

Strategically placed sensors have the potential to improve safety in a multitude of ways. However, city officials are justly concerned that the massive amounts of data collected might not be useful, and that it could push current systems past their limits.

There are two main obstacles standing between city officials and smart city adoption. The first problem is the issue of integrating new technologies within existing systems, and the second problem is figuring out how to ensure the implemented sensors collect beneficial data.

The Apple Watch is a terrific example of how technology can be both helpful and harmful. The ability of the Apple Watch to distinguish between a “fall” and a “drop” could be more than the health-care system bargained for. One could say that the technology has the potential to save lives, especially for the elderly.

On the other hand, in the event of a malfunction, the sensors could create an excessive number of 911 calls when they aren’t actually needed. With possibly millions of the devices in a densely populated city, it’s easy to see how the issue could escalate and consume emergency call centers with false alarms.

IoT advantages

Despite the complexities of integration, the cities that do transition to smart cities stand to benefit greatly. A network of connected sensors and devices can reduce the severity of accidents or eliminate them entirely. For instance, Tesla has installed sensors that help its cars intelligently avoid impacting other vehicles.

Recently, the city of Corona, CA began its transition to a smart city. The sensors it has implemented can provide an incredibly rich picture of what’s happening. Many of the most revolutionary technologies have yet to be invented, but the data gathered by these tools is already helping city officials use their resources more effectively.

For example, officers can distribute Amber Alert information to an entire population, and apps like Waze show transportation officials valuable traffic data so they can reduce bottlenecks. A smart watch might even give paramedics a patient’s vitals before they arrive on the scene. No matter the city, smart tech has the potential to improve safety, efficiency, and quality of life for residents.
