
NHL Partners with AWS (Amazon) for Cloud Infrastructure

NHL Powered by AWS

“Do you believe in miracles? Yes!” This was ABC sportscaster Al Michaels’ call “heard ’round the world” after the U.S. National Team beat the Soviet National Team at the 1980 Lake Placid Winter Olympic Games to advance to the medal round. One of the greatest sports moments ever, it lives on in the memory of hockey fans and is readily available for all of us to enjoy as many times as we want thanks to modern technology. Now the National Hockey League (NHL) is expanding its reach with technology, announcing a partnership with Amazon Web Services (AWS). AWS will become the official cloud storage partner of the league, making sure historic moments like the Miracle on Ice are never forgotten.

The NHL will rely exclusively on AWS for artificial intelligence and machine learning as it looks to automate video processing and content delivery in the cloud. AWS will also power the league’s Puck and Player Tracking (PPT) System to better capture the details of gameplay. Hockey fans everywhere are in for a treat!

What is the PPT System?

The NHL has been developing the PPT system since 2013. Once installed in every arena in the league, the system will consist of several antennas in the rafters, tracking sensors worn by every player in the game, and tracking sensors built into the hockey pucks. The puck sensors can be read up to 2,000 times per second, yielding a stream of coordinates that can be turned into new stats and analytics.
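As a rough illustration of how timestamped coordinates become analytics (this is our own sketch, not the NHL’s actual pipeline, and the sample format is hypothetical), puck speed could be estimated from consecutive sensor readings:

```python
import math

def puck_speed(samples):
    """Estimate puck speed (feet/sec) from timestamped (t, x, y)
    coordinate samples, using the last two readings.

    `samples` is a list of (t_seconds, x_feet, y_feet) tuples --
    an illustrative format, not the NHL's real sensor output."""
    (t0, x0, y0), (t1, x1, y1) = samples[-2], samples[-1]
    dist = math.hypot(x1 - x0, y1 - y0)  # straight-line distance moved
    return dist / (t1 - t0)

# At up to 2,000 readings per second, consecutive samples are ~0.0005 s apart.
readings = [(0.0000, 100.0, 42.0), (0.0005, 100.05, 42.0)]
print(round(puck_speed(readings), 1))  # speed in feet per second
```

The same coordinate stream could feed shot speed, skating distance, or zone-time stats by aggregating over longer windows.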

The Puck Stops Here! Learn how the NHL’s L.A. Kings use LTO Tape to build their archive.

How Will AWS Change the Game?

AWS’s state-of-the-art technology and services will give the NHL the capability to deliver analytics and insights that highlight the speed and skill of the game and drive deeper fan engagement. For example, a hockey fan in Russia could receive additional stats and camera angles for a major Russian player. For international audiences, that could be huge. Eventually, personalized feeds could be possible, letting viewers mix and match various audio and visual elements.

The NHL will also build a video platform on AWS to store video, data, and related applications in one central source, enabling easier search and retrieval of archival video footage. Live broadcasts will have instant access to NHL content and analytics for airing and licensing, ultimately enhancing the broadcast experience for every viewer. New services such as virtual reality experiences, augmented reality-powered graphics, and live betting feeds can also be added to video feeds.

As part of the partnership, Amazon Machine Learning Solutions will work with the league to apply its technology to in-game video and official NHL data. The plan is to convert the data into advanced game analytics and metrics to further engage fans. The ability to collect, analyze, and distribute data as fast as possible was a key reason the NHL partnered with AWS.

The NHL plans to use AWS Elemental Media to develop and manage cloud-based HD and 4K video content that will provide a complete view of the game to NHL officials, coaches, players, and fans. When making a crucial game-time decision on a penalty call, referees will have multi-angle 4K video and analytics to help them make the correct call on the ice. According to Amazon Web Services, the system will encode, process, store, and transmit game footage from a series of camera angles to provide continuous video feeds that capture plays and events outside the field of view of traditional cameras.

The NHL and AWS plan to roll out the new features gradually over the coming seasons, making adjustments along the way to enhance the fan experience. As one of the oldest and toughest sports around, hockey is about to take on a new, sleeker look. With all the data teams will be able to collect, we should expect a faster, stronger, more in-depth game. Do you believe in miracles? Hockey fans sure do!

Open Source Software

Open-source Software (OSS)

Open-source software, often referred to as OSS, is computer software whose source code is released under a license in which the copyright holder grants users the rights to use, study, change, and distribute the software as they choose. Originating in the context of software development, the term open-source describes something people can modify and share because its design is publicly accessible. Nowadays, “open-source” indicates a wider set of values known as “the open-source way.” Open-source projects and initiatives support and observe standards of open exchange, mutual contribution, transparency, and community-oriented development.

What is the source code of OSS?

The source code associated with open-source software is the part of the software most users never see. Source code is the code that computer programmers can modify to change how the software works. Programmers with access to the source code can improve a program by adding features to it or by fixing bugs that prevent the software from working correctly.

If you’re going to use OSS, you may want to consider also using a VPN. Here are our top picks for VPNs in 2021.

Examples of Open-source Software

For software to be considered open-source, its source code must be freely available to its users. This gives users the ability to modify the program and distribute their own versions of it. Users also have the power to give out as many copies of the original program as they want. Anyone can use the program for any purpose; there are no licensing fees or other restrictions on the software.

Linux is a great example of an open-source operating system. Anyone can download Linux, create as many copies as they want, and offer them to friends. Linux can be installed on an infinite number of computers. Users with more knowledge of program development can download the source code for Linux and modify it, creating their customized version of that program. 

Below is a list of the top 10 open-source software programs available in 2021.

  1. LibreOffice
  2. VLC Media Player
  3. GIMP
  4. Shotcut
  5. Brave
  6. Audacity
  7. KeePass
  8. Thunderbird
  9. FileZilla
  10. Linux

Setting up Linux on a server? Find the best server for your needs with our top 5.

Advantages and Disadvantages of Open-source Software

Similar to any other software on the market, open-source software has its pros and cons. Open-source software is typically easier to get than proprietary software, resulting in increased use. It has also helped to build developer loyalty as developers feel empowered and have a sense of ownership of the end product. 

Open-source software is usually more flexible, quicker to innovate, and more reliable thanks to the thousands of independent programmers testing and fixing its bugs around the clock. It is more flexible because modular systems allow programmers to build custom interfaces or add new abilities. The quicker innovation of open-source programs is the result of teamwork among a large number of programmers. Furthermore, open-source software is not reliant on the company or author that originally created it. Even if the company fails, the code continues to exist and be developed by its users.

Open-source software also requires lower spending on marketing and logistical services, and it can be a great tool to boost a company’s image, including that of its commercial products. The OSS development approach has helped produce reliable, high-quality software quickly and at a bargain price. A 2008 report by the Standish Group estimated that the adoption of open-source software models saves consumers about $60 billion per year.

On the flip side, an open-source development process may lack the well-defined stages that are usually needed, such as system testing and documentation, both of which may be ignored. Skipping these stages has mainly been an issue for small projects; larger projects tend to define and impose at least some of the stages out of the necessity of teamwork.

Not all OSS projects have been successful, either; SourceXchange and Eazel, for example, both failed. It is also difficult to build a financially strong business model around the open-source concept: a project may satisfy technical requirements while ignoring those needed for market profitability. Regarding security, open source may allow hackers to discover the weaknesses or gaps of the software more easily than closed-source software does.

Benefits for Users of OSS

The most obvious benefit of open-source software is that it can be used for free. Let’s use the example of Linux above. Unlike Windows, users can install or distribute as many copies of Linux as they want, without limitations. Installing Linux for free can be especially useful for servers. If a user wants to set up a virtualized cluster of servers, they can easily duplicate a single Linux server. They don’t have to worry about licensing and how many instances of Linux they’re authorized to operate.

An open-source program is also more flexible, allowing users to modify their own version to an interface that works for them. When a Linux desktop introduces a new interface that some users aren’t fans of, they can modify it to their liking. Open-source software also allows developers to “be their own creator” and design their own software. Did you know that Android and Chrome OS are operating systems built on Linux and other open-source software? The core of Apple’s OS X was built on open-source code, too. When users can manipulate the source code and develop software tailored to their needs, the possibilities are truly endless.

The Best Way to Prepare for a Data Center Take Out and Decommissioning

Whether your organization plans on relocating, upgrading, or migrating to the cloud, data center take outs and decommissioning are no easy feat. There are countless ways something could go wrong if you attempt such a daunting task on your own. Partnering with an IT equipment specialist that knows the ins and outs of data center infrastructure is the best way to go. Since 1965, our highly experienced team of equipment experts, project managers, IT asset professionals, and support staff has handled numerous successful data center projects in every major US market. From a single server rack to a warehouse-sized data center holding thousands of IT assets, we have the technical and logistical capabilities to handle your data center take out and decommissioning needs. Regardless of the requirements you’re facing, we can design a complete end-to-end solution to fit your specific needs.

 

Learn more about the data center services we offer

 

But that’s enough about us. We wrote this article to help YOU. We put together a step-by-step guide on how to prepare your data center to be removed completely or to retire the assets it holds. As always, we are here to help every step of the way.

Make a Plan

Create a list of goals you wish to achieve with your take out or decommissioning project. Make an outline of expected outcomes or milestones with expected completion times; these will keep you on task and on course. Appoint a project manager to oversee the project from start to finish. Most importantly, ensure backup systems are working correctly so no data is lost along the way.

 

Make a List

Be sure to make an itemized list of all hardware and software that will be involved in the decommissioning project or data center take out. Make sure nothing is overlooked, and check twice with a physical review. Once all of the equipment in your data center is itemized, build a complete inventory of assets, including hardware items such as servers, racks, networking gear, firewalls, storage, routers, switches, and even HVAC equipment. Collect all software licenses and virtualization hardware involved, and keep the software licenses associated with servers and networking equipment.
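As a sketch of what that itemized inventory might look like, the records below are written to CSV so they can be reviewed and reconciled against the physical audit (the fields, asset tags, and models are illustrative examples, not a standard):

```python
import csv
import io

# Hypothetical inventory records: (asset tag, type, model, serial, location)
assets = [
    ("DC-0001", "server",  "Dell R740",  "SN-84A2", "Rack 12, U10"),
    ("DC-0002", "switch",  "Cisco 9300", "SN-77F1", "Rack 12, U42"),
    ("DC-0003", "storage", "NetApp FAS", "SN-19C8", "Rack 13, U20"),
]

buf = io.StringIO()
writer = csv.writer(buf)
writer.writerow(["asset_tag", "type", "model", "serial", "location"])
writer.writerows(assets)  # fields containing commas are quoted automatically

print(buf.getvalue())
```

A spreadsheet like this also gives you a natural place to later record per-asset status (backed up, wiped, packed, shipped) as the project progresses.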

 

Partner with an ITAD Vendor

Partnering with an experienced IT Asset Disposition (ITAD) vendor can save you a tremendous amount of time and stress. An ITAD vendor can help with the implementation plan listing roles, responsibilities, and activities to be performed within the project. Along with the previous steps mentioned above, they can assist in preparing tracking numbers for each asset earmarked for decommissioning, and cancel maintenance contracts for equipment needing to be retired. 

Learn more about our ITAD process

 

Get the Required Tools

Before you purchase or rent any tools or heavy machinery, make a list of the tools, materials, and labor hours you will need to complete this massive undertaking. Some examples of tools and materials that might be necessary include forklifts, hoists, device shredders, degaussers, pallets, packing foam, hand tools, labels, boxes, and crates. Calculate the number of labor hours needed to get the job done, and try to be as specific as possible about what the job requires at each stage. If outside resources are needed, make sure to perform the necessary background and security checks ahead of time. After all, it is your data at stake.

 

Always Think Data Security

When the time comes to start the data center decommissioning or take out project, review your equipment checklist and verify that all of your data has been backed up before powering down and disconnecting any equipment. Be sure to tag and map cables for easier setup and transport, record serial numbers, and tag all hardware assets. For any equipment that will be transported off-site and no longer used, data erasure may be necessary. When transporting data off-site, make sure a logistics plan is in place. A certified and experienced ITAD partner will typically offer certificates of data destruction and chain of custody during the entire process. They may also advise you on erasing, degaussing, shredding, or preparing each itemized piece of equipment for recycling.

Learn more about the importance of data security

 

Post Takeout and Decommission

Once the data center take out and decommission project is complete, the packing can start. Make sure you have a dedicated space for packing assets. If any equipment is allocated for reuse within the company, follow the appropriate handoff procedure. For assets intended for refurbishing or recycling, pack and label for the intended recipients. If not using an ITAD vendor, be sure to use IT asset management software to track all stages of the process.

Microsoft’s Project Natick: The Underwater Data Center of the Future

When you think of underwater, deep-sea adventures, what comes to mind? Colorful plants, odd-looking sea creatures, maybe even a shipwreck or two; but what about a data center? Moving forward, underwater data centers may become the norm rather than an anomaly. Back in 2018, Microsoft sank an entire data center to the bottom of the Scottish sea, submerging 864 servers and 27.6 petabytes of storage. After two years sitting 117 feet deep in the ocean, Microsoft’s Project Natick, as it’s known, has been brought to the surface and deemed a success.

What is Project Natick?

 

Microsoft’s Project Natick was conceived back in 2015, when the company began exploring whether submerged servers could significantly lower energy usage. To test the original hypothesis, Microsoft immersed a data center off the coast of California for several months as a proof of concept, to see if the computers would even endure the underwater journey. Ultimately, the experiment was designed to show that portable, flexible data center deployments in coastal areas around the world could scale up data center capacity while keeping energy and operating costs low. This would allow companies to place smaller data centers closer to where customers need them, instead of routing everything to centralized hubs. Next, the company will look into increasing the size and performance of these data centers by connecting several together to merge their resources.

What We Learned from Microsoft’s Undersea Experiment

After two years of submersion, the results of the experiment not only showed that offshore underwater data centers perform well overall, but also revealed that the servers inside proved to be up to eight times more reliable than their above-ground equivalents. The team of researchers plans to examine exactly what was responsible for this greater reliability; for now, steady temperatures, the absence of oxygen corrosion, and a lack of humans bumping into the computers are thought to be the reasons. Hopefully, the same outcome can be transposed to land-based server farms for increased performance and efficiency across the board.

Additional findings included the ability to operate with greater power efficiency, especially in regions where the grid on land is not considered reliable enough for sustained operation. Microsoft will also take lessons on renewability from the project’s successful deployment, with Natick relying on wind, solar, and experimental tidal technologies. As for future underwater servers, Microsoft acknowledged that the project is still in its infancy; building a data center with the same capabilities as a standard Microsoft Azure data center would require multiple vessels.

Do your data centers need servicing?

The Benefits of Submersible Data Centers

 

Using natural cooling instead of energy to cool a data center is an obvious positive outcome of the experiment. When Microsoft hauled its underwater data center up from the bottom of the North Sea and conducted some analysis, researchers also found the servers were eight times more reliable than those on land.

The shipping-container-sized pod, recently pulled from 117 feet below the North Sea off Scotland’s Orkney Islands, was deployed in June 2018. Over the last two years, researchers observed the performance of 864 standard Microsoft data center servers installed on 12 racks inside the pod. During the experiment they also learned more about the economics of modular undersea data centers, which can be quickly set up offshore near population centers and need fewer resources for efficient operation and cooling.

Natick researchers believe the servers benefited from the pod’s nitrogen atmosphere, which is less corrosive than oxygen. The absence of human interaction to disrupt components also likely contributed to the increased reliability.

The North Sea-based project also exhibited the possibility of leveraging green technologies for data center operations. The data center was connected to the local electric grid, which is 100% supplied by wind, solar and experimental energy technologies. In the future, Microsoft plans to explore eliminating the grid connection altogether by co-locating a data center with an ocean-based green power system, such as offshore wind or tidal turbines.

Snowflake IPO

On September 16, 2020, history was made on the New York Stock Exchange. A software company named Snowflake (ticker: SNOW) completed the largest software IPO ever. One of the most hotly anticipated listings of 2020, Snowflake began trading at $120 per share and jumped to $300 per share within a matter of minutes. With that never-before-seen surge in price, Snowflake also became the largest company ever to double in value on its first day of trading, ending with a value of almost $75 billion.

What is Snowflake?

So, what exactly does Snowflake do? What makes billionaire investors like Warren Buffett and Marc Benioff jump all over a newly traded software company? It must be something special, right? With all the speculation surrounding the IPO, it’s worth explaining what the company does. A simple explanation is that Snowflake helps companies store their data in the cloud rather than in on-site facilities. Traditionally, a company’s data has been stored on-premises on physical servers managed by that company, and tech giants like Oracle and IBM have led that industry for decades. Snowflake is profoundly different. Instead of helping companies store their data on-premises, Snowflake facilitates the warehousing of data in the cloud. But that’s not all. Snowflake also makes the data queryable, simplifying the process for businesses looking to pull insights from their stored data. This is what sets Snowflake apart from the other data-hoarding behemoths of the IT world. Snowflake discovered the secret to separating data storage from the act of computing on the data, and it did so before any of the other big players like Google, Amazon, or Microsoft. Snowflake is here to stay.

Snowflake’s Leadership

Unlike Silicon Valley’s tech unicorns of the past, Snowflake was started in 2012 by three database engineers. Backed by venture capitalists, including one VC firm that wishes to remain anonymous, Snowflake is currently led by software veteran Frank Slootman. Before taking the reins at Snowflake, Slootman had great success leading Data Domain and ServiceNow, growing Data Domain from a twenty-employee startup to over $1 billion in sales and a $2.4 billion acquisition by EMC. It’s safe to say that Snowflake is in the right hands, especially if it hopes to mature into its valuation.

Snowflake’s Product Offering

Snowflake isn’t the only managed data warehouse in the industry; both Amazon Web Services’ (AWS) Redshift and Google Cloud Platform’s (GCP) BigQuery are common alternatives. So something had to set Snowflake apart from the competition: a combination of flexibility, service, and user interface. With a database like Snowflake, two pieces of infrastructure drive the revenue model: storage and compute. Snowflake takes responsibility for storing the data as well as ensuring that queries run fast and smoothly. The idea of splitting storage and compute in a data warehouse was unusual when Snowflake launched in 2012; today, there are query engines like Presto that exist solely to run queries, with no storage included. Snowflake offers the advantages of splitting storage and queries: stored data is located remotely in the cloud, saving local resources for the computing load. Moving storage to the cloud delivers lower cost, higher availability, and greater scalability.
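To make the storage/compute split concrete, here is a toy sketch of the idea (our own model, assuming nothing about Snowflake’s real internals): data lives once in a shared storage layer, while any number of independent compute "warehouses" can be created to query it without copying the data:

```python
class Storage:
    """Central storage layer: data is loaded here once and shared."""
    def __init__(self):
        self.tables = {}

    def load(self, name, rows):
        self.tables[name] = rows


class Warehouse:
    """Independent compute: can be added or removed without moving data."""
    def __init__(self, storage):
        self.storage = storage

    def query(self, table, predicate):
        # Reads from shared storage; only the compute runs locally.
        return [row for row in self.storage.tables[table] if predicate(row)]


store = Storage()
store.load("games", [{"team": "LAK", "goals": 3}, {"team": "NYR", "goals": 1}])

# Two warehouses (say, one for BI dashboards, one for ETL) share one store.
bi, etl = Warehouse(store), Warehouse(store)
print(bi.query("games", lambda r: r["goals"] >= 2))
```

The design point the toy model illustrates: because compute holds no data of its own, you can scale it up for a heavy workload or tear it down entirely, and the stored data is unaffected.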

 

Multiple Vendor Options

A majority of companies have adopted a multi-cloud approach, preferring not to be tied down to a single cloud provider. There’s a natural hesitancy to choose options like BigQuery that are bound to a single cloud like Google. Snowflake offers a different kind of flexibility, operating on AWS, Azure, or GCP and satisfying the multi-cloud wishes of CIOs. With tech giants battling for domination of the cloud, Snowflake is, in a sense, the Switzerland of data warehousing.

Learn more about a multi-cloud approach


Snowflake as a Service

When considering building a data warehouse, you need to take into account the management of the infrastructure itself. Even when farming out servers to a cloud provider, decisions about storage sizing, scaling for growth, and networking hardware come into play. Snowflake, by contrast, is a fully managed service: users don’t need to build any infrastructure at all. Just put your data into the system and query it. Simple as that.

While fully managed services sound great, they come at a cost. Snowflake users need to be deliberate about storing and querying their data, as fully managed services are pricey. If you are deciding whether to build or buy your data warehouse, it would be wise to compare the total cost of Snowflake ownership to the cost of building something yourself.

 

Snowflake’s User Interface and SQL Functionality

Snowflake’s UI for querying and exploring tables is as easy on the eyes as it is to use. Its SQL functionality is also a strong selling point. SQL (Structured Query Language) is the programming language that developers and data scientists use to query their databases, and each database has slightly different details, wording, and structure. Snowflake’s SQL seems to have collected the best from all of the database languages and added other useful functions.

 

A Battle Among Tech Giants

As the saying goes, with competition comes reason for caution. Snowflake is rubbing shoulders with some of the world’s largest companies, including Amazon, Google, and Microsoft. While Snowflake has benefited from an early market advantage, the Big Three are catching up quickly by building comparable platforms.

However, Snowflake depends on these same competitors for data storage. It has managed to thrive by acting as “Switzerland,” so customers don’t have to commit to just one cloud provider. As more competition enters the multi-cloud service industry, nonalignment can be an advantage, but it may not always be possible. Snowflake’s market share is vulnerable, as there are no clear barriers to entry for the industry giants, given their technical talent and size.

Snowflake is still an infant in the public eye, and we will see whether it sinks or swims over the next year or so. But with brilliant leadership, a promising market, and an extraordinary track record, Snowflake may be much more than a one-hit wonder. It may be a once-in-a-lifetime business.

DTC – A True Partnership

For Over Half a Century, We’ve Been Committed to Serving IT Departments and Saving IT Budgets

 

Our Story

In 1965, we opened our doors for business with the idea of transforming the IT equipment industry through technology, transparency, standards, and processes. We planted our roots as a round reel tape company in Downey, CA. As a family-owned and operated business over the past 50 years, we have grown into one of the most trustworthy, reliable, and authoritative organizations in the industry.

From disk pack storage and round reel tape to hard drives, networked storage, tape libraries, and cloud backup systems, our business and partnerships continue to prosper and grow with the constantly innovating IT industry. DTC proudly works with all organizations, letting our reputation speak for itself.

DTC’s 3 Point Message is Simple:

 

  • Our goal is to reach 100% Recyclability of old storage media and IT assets.

 

Electronics recycling is our bread and butter. We’ve been saving both the environment and companies’ money by setting the standard for secure handling and repurposing of used and obsolete electronics. Recycling of electronics and IT equipment is an essential part of a company’s waste management strategy. If you are looking for a safe and secure way to recycle electronics, consider our proven services. We specialize in the ethical disposal and reprocessing of used and obsolete electronics and computer equipment, and we can help your organization meet its legal and environmental goals. Let us be the solution to your problem and help your organization stay socially responsible.

 

Learn more about recycling your old IT assets

 

  • Our pledge since day one has been to keep your data safe.

 

Data security is a main concern for IT departments in any organization, and rightly so. Many of our partners demand that their data be handled appropriately and destroyed according to both government and industry standards. DTC provides honest and secure data destruction services, including physical destruction with a mobile shredder and secure data erasure methods like degaussing. All of our destruction services are effective, auditable, and certified. Ship storage assets to our secured facility, or simply ask for the mobile data destroyer to be deployed on site. With over 50 years of service, we’ve never had a single data leak. Now that’s experience you can trust!

Learn more about DTC data security

 

  • Our process will help you save time and money.

 

Our IT asset disposition (ITAD) process will help your organization recoup dollars from surplus, used IT assets and free up storage space at your facility. Our equipment buyback program is dedicated to purchasing all types of surplus and used data storage and IT equipment, and we use the highest standards to ensure you get the greatest return on your initial IT investment. With the current pace of hardware evolution, most companies upgrade their systems every two years, which leads to a lot of surplus IT equipment. DTC has the experience and resources to get you the most for your old IT assets.

Get the most return on your IT investment 

The Value We Provide

DTC’s diverse knowledge base and experience allow our partners to use our purchasing and sales personnel as a valued resource for questions, research, and answers. Our vast database and contact list of customers, resellers, recyclers, suppliers, and industry partners allow us to secure excellent pricing when sourcing your IT equipment. Don’t believe us? Let us know what you need, and we will find it for you.

How Can We Help You?

Here is a brief list of the services we provide:

 

Ready to work with a trusted partner? Contact Us Today



The TikTok Controversy: How Much Does Big Tech Care About Your Data and its Privacy?

If you have a teenager in your house, you’ve probably encountered them making weird dance videos in front of their phone’s camera. Welcome to the TikTok movement that’s taking over our nation’s youth. TikTok is a popular social media video sharing app that continues to make headlines due to cybersecurity concerns. Recently, the U.S. military banned its use on government phones following a warning from the DoD about potential personal information risk. TikTok has now verified that it patched multiple vulnerabilities that exposed user data. In order to better understand TikTok’s true impact on data and data privacy, we’ve compiled some of the details regarding the information TikTok gathers, sends, and stores.

What is TikTok?

TikTok is a video-sharing application that allows users to create short, fifteen-second videos on their phones and post them to a public platform. Videos can be enriched with music and visual elements such as filters and stickers. The app’s young, adolescent demographic, along with the content created and shared on the platform, has put its privacy features in the limelight as of late. Even more, questions about where TikTok data is stored and who can access it have raised red flags.

You can review TikTok’s privacy statement for yourself here.

TikTok Security Concerns

Even though TikTok allows users to control who can see their content, the app asks for a number of consents on your device; most notably, it accesses your location and device information. While there’s no evidence of malicious activity or that TikTok is violating its privacy policy, users are still advised to practice caution with the content they create and post.

The biggest concern surrounding the TikTok application is where user information is stored and who has access to it. According to the TikTok website, “We store all US user data in the United States, with backup redundancy in Singapore. Our data centers are located entirely outside of China, and none of our data is subject to Chinese law.” It also states, “The personal data that we collect from you will be transferred to, and stored at, a destination outside of the European Economic Area (“EEA”).” There is no other specific information about where user data is stored.

Recently, TikTok published a Transparency Report which lists “legal requests for user information”, “government requests for content removal”, and “copyrighted content take-down notices”. The “Legal Requests for User Information” section shows that India, the United States, and Japan are the top three countries where user information was requested. The United States was the number one country in both fulfilled requests (86%) and number of accounts specified in the requests (255). Oddly enough, China is not listed as having received any requests for user information.

What Kind of Data is TikTok Tracking?

Below are some of the consents TikTok requires on Android and iOS devices after installation of the app is completed. While some of the permissions are to be expected, and all are consistent with TikTok's written privacy policy, the full picture of what the app gathers from its users can be alarming. In short, the app allows TikTok to:

  • Access the camera (and take pictures/video), the microphone (and record sound), the device’s WIFI connection, and the full list of contacts on your device.
  • Determine if the internet is available and access it if it is.
  • Keep the device turned on and start itself automatically.
  • Secure detailed information on the user’s location using GPS.
  • Read and write to the device’s storage, install/remove shortcuts, and access the flashlight (turn it off and on).

You read that right: TikTok has full access to your audio, video, and the list of contacts on your phone. The geolocation tracking via GPS is somewhat surprising though, especially since TikTok videos don't display location information. So why collect it? If you operate an Android device, TikTok is also capable of accessing other apps running at the same time, which can expose data held in another app, such as a banking or password storage app.

Why is TikTok Banned by the US Military?

In December 2019, the US military started instructing soldiers to stop using TikTok on all government-owned phones. This policy reversal came shortly after the release of a Dec. 16 Defense Department Cyber Awareness Message classifying TikTok as having potential security risks associated with its use. As the US military cannot prevent government personnel from accessing TikTok on their personal phones, leaders recommended that service members use caution if unfamiliar text messages are received.

In fact, this was not the first time that the Defense Department had been required to encourage service members to remove a popular app from their phones. In 2016, the Defense Department banned the augmented-reality game, Pokémon Go, from US military owned smartphones. However, this case was a bit different as military officials alluded to concerns over productivity and the potential distractions it could cause. The concerns over TikTok are focused on cybersecurity and spying by the Chinese government.

In the past, the DoD has put out more general social media guidelines, advising personnel to proceed with caution when using any social platform. And all DoD personnel are required to take annual cyber awareness training that covers the threats that social media can pose.

3-2-1 Backup Rule

What is the 3-2-1 Backup Rule?

 

The 3-2-1 backup rule is a concept made famous by photographer Peter Krogh. He said, in essence, that there are two types of people: those who have already had a storage failure and those who will have one in the future. It's inevitable. The 3-2-1 backup rule helps to answer two important questions: how many backup files should I have, and where should I store them?

The 3-2-1 backup rule goes as follows:

  • Have at least three copies of your data.
  • Store the copies on two different media.
  • Keep one backup copy offsite.

Need help with building your data backup strategy?

  1. Create at least THREE different copies of your data

Yes, I said three copies. That means that in addition to your primary data, you should also have at least two more backups that you can rely on if needed. But why isn't one backup sufficient, you ask? Suppose you keep your original data on storage device A and its backup on storage device B. Both storage devices have the same characteristics, and they have no common failure causes. If device A has a probability of failure of 1/100 (and the same is true for device B), then the probability of both devices failing at the same time is 1/10,000.

So with THREE copies of data, if you have your primary data (device A) and two backups of it (devices B and C), and if all devices have the same characteristics and no common failure causes, then the probability of all three devices failing at the same time is 1/1,000,000. That's much better than having only one copy and a 1/100 chance of losing it all, wouldn't you say? Keeping more than two copies of data also guards against losing everything to a single event, such as a natural disaster, when the primary copy and its backup are stored in the same physical location.
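The arithmetic above is easy to check for yourself. A minimal Python sketch, assuming every device fails independently with the same probability:

```python
def prob_total_loss(p_failure: float, copies: int) -> float:
    """Chance that every copy fails at the same time, assuming each
    device fails independently with the same per-device probability."""
    return p_failure ** copies

p = 1 / 100  # per-device failure probability from the example above

one_copy = prob_total_loss(p, 1)      # 1 in 100
two_copies = prob_total_loss(p, 2)    # 1 in 10,000
three_copies = prob_total_loss(p, 3)  # 1 in 1,000,000
```

Each extra independent copy multiplies the odds of total loss by another 1/100, which is why the jump from one copy to three is so dramatic.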

  2. Store your data on at least TWO different types of media

Now, in the scenario above we assumed that there were no common failure causes for the devices that contain your precious data. Clearly, this requirement is much harder to fulfill if your primary data and its backup are located in the same place. Disks from the same RAID array aren't typically independent, and it is not uncommon for one or more disks from the same storage enclosure to fail around the same time.

This is where the “2” in the 3-2-1 rule comes in. It is recommended that you keep copies of your data on at least TWO different storage types: for example, internal hard disk drives AND removable storage media such as tapes, external hard drives, USB drives, or SD cards. It is even possible to keep data on two internal hard disk drives in different storage locations.

 

Learn more about purchasing tape media to expand your data storage strategy 

  3. Store at least ONE of these copies offsite

Believe it or not, physical separation between data copies is crucial. It's a bad idea to keep your external storage device in the same room as your primary storage device. Just ask the numerous companies located in the path of a tornado or in a flood zone. Or what would you do if your business caught fire? If you work for a smaller company with only one location, storing your backups in the cloud is a smart alternative. Tapes stored at an offsite location are also popular among companies of all sizes.
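The whole 3-2-1 procedure can be sketched in a few lines of Python. This is an illustration only; the function name and the example paths (a USB mount and a cloud-synced folder) are hypothetical stand-ins for whatever second medium and offsite destination you actually use:

```python
import shutil
from pathlib import Path

def backup_321(source: Path, second_medium: Path, offsite: Path) -> None:
    """Copy `source` to two additional destinations so that three
    copies exist: the original, one on a different medium, and one
    that ends up offsite."""
    for destination in (second_medium, offsite):
        destination.mkdir(parents=True, exist_ok=True)
        # copy2 preserves file metadata (timestamps) along with content
        shutil.copy2(source, destination / source.name)

# Hypothetical paths, for illustration only:
# backup_321(Path("reports/q3.xlsx"),
#            Path("/mnt/usb_drive/backups"),    # second medium
#            Path("/mnt/cloud_sync/backups"))   # folder synced offsite
```

A real backup job would add scheduling, verification, and error handling, but the shape is the same: one source, two extra copies, one of them leaving the building.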

 

Every system administrator should have a backup. This principle works for any virtual environment; regardless of the system you are running, backup is king!

3 Things Star Wars Taught Us About Data Storage

A long time ago in a galaxy far, far away, a farm boy on a desert planet joined an uprising while saving a princess from a dark lord. Just like that, one of the most popular cinematic stories of all time was created. What began with one film in 1977 rapidly developed into one of the most successful media empires in existence. Now, more than four decades after it was introduced to audiences worldwide, Star Wars remains a global pop culture sensation. 

While doing my initial research for this article, I came across quite an amazing fact: George Lucas is truly a data backup and cloud storage buff! Knowing this bit of information, I began to see it throughout the Star Wars narrative. In the wake of this newfound knowledge and inspiration, I'd urge you to weigh the following to make sure your organization doesn't endure the same struggles that Darth Vader and the Skywalkers have over the course of their adventures.

Have a Data Security Plan in Place

It is pretty common knowledge that a decent data backup strategy starts with a strong approach to data security. With a data security plan in place, an organization can considerably reduce the chances of relying solely on backups. Sadly, even the most minimal types of security appear to have been neglected throughout the Star Wars saga.

The first eight installments of the saga are rampant with data security concerns, but possibly the most noticeable occurs when Obi-Wan looks into the origins of a Kamino saberdart. While looking for the planet Kamino in the Jedi Archives, he discovers a blank space. Yoda concludes that the planet has likely been deleted from the archives.

Some theories state that Jedi training was a specific type of password protection, since the Force is required to manipulate the Jedi Archives. This wouldn't be very wise considering there were numerous trained Jedi in the galaxy, and their sworn enemies were Force users. A modern-day equivalent would be Google and Amazon offices sharing the same keycards. Not exactly infosec friendly, to say the least! The Jedi had weak passwords with absolutely no permissions management.

In an effort to prevent cybercriminals from accessing your company's vital data, it is good practice to perform regular reviews of your data security strategy. Double-check that there are no glaring holes, habitually change passwords, use two-factor authentication, and ALWAYS use encryption.

Have a Backup Plan in Place

Having a data backup plan in place is a key complement to any security strategy. As the Jedi learned, even when your security is set up perfectly, disaster can strike. Inadequate security management on both sides led to the destruction of six planets, the mutilation of two others, and the obliteration of a handful of superweapons.

The best approach for such a plan is referred to as a 3-2-1 backup strategy. This means that for every piece of data, you have the data itself, a backup copy on site (on an external hard drive of some sort), and a final copy in the cloud. Now, I've come across several questions regarding the specific data storage medium used in the saga. Why does the Star Wars universe use a technology as basic as data tapes? Simply put, backups are better on tape. The fewer moving parts an object has, the more robust it is. Additionally, the long-term lifespan of electronic memory is a dilemma compared to that of tape. When you make a backup of something, you want it to survive a long time.

We first see the blueprints for the superweapon when Count Dooku decides that they would be safe with Darth Sidious on Scarif. By the time Jyn Erso hears of the Stardust version of the plans for the Death Star, it appears that Scarif is the only place in the galaxy where you could find a copy. In a way, the plans on Scarif served as the Empire's cloud storage. All the Death Star needed was an external hard drive following it through space with an additional copy of the plans!

If you only have one backup, it's better than nothing. Not much better, but it is. Make certain that you're using a 3-2-1 approach for a better backup strategy. Also consider data backups distributed across different geographic regions in case of a natural disaster. Tape is great for this!

Have a Version Control Plan in Place

What good is it to have backups if you don't have the right information backed up? If we came away with one thing from the plans used to defeat the first Death Star, it's that the Empire didn't manage their version control. The backups they had were not up to date: in the stolen plans, the focus lens for the superlaser is equatorial, when the Death Star's superlaser was actually in the northern hemisphere.

To be reliable, a backup solution needs to run on a consistent basis, sometimes daily or even more frequently depending on the situation and the data on hand. Whatever backup strategy the Death Star was using, it had clearly gone awry.

Version history is also a critical piece of the backup puzzle. It lets users keep multiple versions of a file over extended periods of time, sometimes forever. Had the Empire set up version history, their archived plans could have stayed in step with the Death Star as it was actually built.
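The simplest form of version history is just refusing to overwrite: every backup run writes a new, timestamped copy instead of replacing the last one. A minimal sketch of the idea in Python; the function name and archive layout are illustrative, not a reference to any particular product's feature:

```python
import shutil
from datetime import datetime, timezone
from pathlib import Path

def versioned_backup(source: Path, archive_dir: Path) -> Path:
    """Keep every backup as its own timestamped copy instead of
    overwriting the previous one, so older versions stay recoverable."""
    archive_dir.mkdir(parents=True, exist_ok=True)
    # UTC timestamp (with microseconds) makes each copy's name unique
    stamp = datetime.now(timezone.utc).strftime("%Y%m%dT%H%M%S%f")
    target = archive_dir / f"{source.stem}_{stamp}{source.suffix}"
    shutil.copy2(source, target)
    return target
```

With copies named `plans_20191216T143000123456.dat` and so on, reverting to any earlier revision is just a matter of picking the file with the right timestamp.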

Whether you manage a small business or a galaxy-spanning enterprise, your data (assuming it's properly secured and backed up) can mean the difference between domination and having your battle station blown into a million pieces across the dark void of space. Data security and backup doesn't need to be a battle. Create a plan that works for you, ensure your data is secured, and verify every so often that it's up to date with the most recent versions. May The Data Be With You.

Industries We Serve

DTC serves a wide range of industries, from healthcare and finance to education and even small business. In today's rapidly evolving, data-focused world, every trade has a need to keep its data safe and its systems running. Our pledge since day one has been to keep your data safe, no matter what industry you're in.

 

MAKING AN IMPACT IN EVERY INDUSTRY

At DTC, we work behind the scenes to solve your IT equipment challenges, improve data security, and reduce the environmental impact of recycled assets. Whether your organization requires extensive data center services, IT asset disposition, e-waste recycling, on-site data destruction, or just general maintenance and support, we've got you covered. For over 50 years, we've made a name for ourselves as one of the most trusted brands in our industry, and we want to help you do the same. See below how we can help your industry.

HEALTHCARE

With extensive databases of medical records containing vital information, data maintenance, storage, and security are an essential part of healthcare. However heavily healthcare institutions depend on computers and other electronic devices, not many organizations have a dedicated IT department able to handle the vast equipment requirements. Our IT equipment specialists can help with your organization's specific needs. We can help save your healthcare institution's budget while you help save lives.

A plan for guarding against ransomware in the healthcare industry

Finance

Another data breach is the last thing any financial institution needs. Most financial organizations have highly developed IT professionals running their data infrastructures, but upgrades and disposal can become overwhelming. Industry best practices state that companies need to keep their data for several decades and should migrate the data to the latest software version of their backup solution. That’s where we come into play. Since 1965, we have been helping businesses, both large and small, keep their data secure while increasing ROI on used IT equipment.

Get Help With Securing Your Financial Data

 

Government

Today more than ever, government agencies depend on computers and electronics in order to effectively operate within their respective borders. Most governments, whether city, county, state, or nationwide, have an extremely long holding period for sensitive, proprietary, or administration-critical data. Even though data tape is the most reliable medium for long-term data storage, tapes still need to be refreshed over time. More importantly, we understand the importance of data security when it comes to sensitive information on your used equipment.

 

Education

The recent COVID-19 pandemic has given us all a firsthand look at how classrooms across the country are integrating 1:1 student-to-computer learning. Now more than ever, IT equipment has become heavily integrated into school settings. Computer learning means a vast increase in information and data creation, requiring substantial data backup and security measures to protect it. However heavily learning institutions depend on computers and other electronic devices, not many school districts have a dedicated IT department able to handle the vast data backup and equipment requirements.

Learn More About Saving IT Budgets in the Education Field

Energy and Utilities

Companies in the energy and utility industries are continuously being asked to find ways to reduce costs and improve efficiency. Whether in power generation or alternative energy services, most companies have a very busy IT department. When the computers, servers, and data storage centers need to be upgraded and replaced, it can be exhausting. The right IT asset disposition (ITAD) partner can help you get the most value from old IT equipment, resulting in maximum return on your initial investment.

Learn More About ITAD and How it can Help You

Small and Mid-size Business

Entrepreneurship, innovation, and job creation are not only the foundation of a healthy economy but the backbone of America. Entrepreneurs embody the strength and character that help make our country great and we’d like to keep it that way. As a family-owned business since 1965, we understand the struggles and sacrifices of being a business owner. Small businesses and owners may not always have the budget for the latest upgrades in IT equipment or even know what to do with their older equipment. We can help with upgrades and disposal of your data storage tapes and other IT assets when needed. 

OUR GOAL IS TO BE YOUR MOST VALUED PARTNER

DTC’s IT equipment specialists and procurement associates bring over 130 years of combined experience, making us one of the industry’s best-trained teams. We opened our doors for business in 1965 with the idea of transforming the IT lifecycle through technology, transparency, standards, and processes. Our business continues to flourish and grow with the ever-changing IT industry, letting our reputation speak for itself.

Interested in Learning More About Who We Are?

Send a message, we’d love to hear from you.
