Blog

533 Million Facebook Users' Data Breached


Facebook is by far the largest and most popular social media platform in use today. With 2.8 billion monthly users and 1.84 billion daily active users, it controls nearly 59% of the social media market. With that many users, one can only imagine the amount of data produced and collected by Facebook every second. The majority of the data collected is personal information about its users. The platform collects its users' names, birthdays, phone numbers, email addresses, locations, and in some cases photo IDs. All of this information could be used maliciously if it fell into the wrong hands, which is why so many people are worried about the latest Facebook data breach.




What happened with the Facebook Data Leak?

The most recent Facebook data leak was exposed by a user in a low-level hacking forum who published the phone numbers and personal data of hundreds of millions of Facebook users for free. The exposed data includes the personal information of over 533 million Facebook users from 106 countries. The leaked data contains phone numbers, Facebook IDs, full names, locations, birthdates, bios, and, in some cases, email addresses.

The leak was discovered in January, when a user in the same hacking forum advertised an automated bot that could provide phone numbers for hundreds of millions of Facebook users for a price. A Facebook spokesperson claims the data was scraped through a vulnerability that the company patched in 2019. Data scraping is a technique in which a computer program extracts data from human-readable output produced by another program. The vulnerability uncovered in 2019 allowed millions of phone numbers to be scraped from Facebook's servers in violation of its terms of service; Facebook says it was patched in August 2019.
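As an illustration of the general technique (using a made-up page, not Facebook's actual flaw), a scraper is simply a program that pulls structured fields out of human-readable output:

```python
import re

# A toy "human-readable" page, standing in for any rendered profile output.
html = """
<div class="profile"><span class="name">Jane Doe</span>
<span class="phone">+1-555-0142</span></div>
<div class="profile"><span class="name">John Roe</span>
<span class="phone">+1-555-0199</span></div>
"""

def scrape_profiles(page: str) -> list:
    """Extract (name, phone) pairs from rendered markup."""
    pattern = re.compile(
        r'<span class="name">(?P<name>[^<]+)</span>\s*'
        r'<span class="phone">(?P<phone>[^<]+)</span>'
    )
    return [m.groupdict() for m in pattern.finditer(page)]

print(scrape_profiles(html))
```

Run at scale against millions of profile pages, a loop this simple is all it takes to assemble the kind of dataset that ended up on the forum.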

However, the scraped data has now been posted on the hacking forum for free, making it available to anyone with basic data skills. The leaked data could be invaluable to cybercriminals who use people's personal information to impersonate victims or scam them into handing over login credentials.




What caused the Facebook data breach?

When Facebook was made aware of the data exposed on the hacking forum, it was quick to say that the data was old, coming from a breach that occurred in 2019. In other words, the company's position is that this is nothing new: the data has been out there for some time, and the underlying vulnerability has been patched. In fact, the data, which first surfaced back in 2019, came from a breach that Facebook did not disclose in any significant detail at the time and never really made publicly known.

Uncertainty about Facebook's explanation stems from the fact that the company has had a number of breaches and exposures from which the data could have come. Here is a list of Facebook "data leaks" in recent years:

  • April 2019 – 540 million records exposed by a third party and disclosed by the security firm UpGuard
  • September 2019 – 419 million Facebook user records scraped from the social network by bad actors before a 2018 Facebook policy change
  • 2018 – Cambridge Analytica third-party data sharing scandal
  • 2018 – Facebook data breach that compromised access tokens and virtually all personal data from about 30 million users

Facebook eventually explained that the most recent data exploit of 533 million user accounts is a different data set that attackers created by abusing a flaw in a Facebook address book contacts import feature. Facebook says it patched the weak point in August 2019, but it’s uncertain how many times the bug was exploited before then.



How can you find out if your personal information is part of the Facebook breach?

With so much personal information on social media today, you'd expect the tech giants to have a strong grip on their data security measures. In the latest Facebook breach, a large amount of data was exposed, including full names, birthdays, phone numbers, and locations. Facebook says that the data leak originated from an issue in 2019, which has since been fixed. Regardless, there's no way to reclaim that data. A third-party website, haveibeenpwned.com, makes it easy to check whether your data was part of the leaked information: simply input your email address to find out. Though 533 million Facebook accounts were included in the breach, only 2.5 million of those records included email addresses. That means you have less than a half-percent chance of showing up on that website. Although this data is from 2019, it could still be of value to hackers and cybercriminals, such as those who engage in identity theft. This should serve as a reminder not to share any personal information on social media that you wouldn't want a stranger to see.
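For programmatic checks, Have I Been Pwned also exposes a REST API (v3, which requires a paid API key). As a rough sketch, the lookup request can be built like this; the endpoint path reflects HIBP's public documentation, but treat the details as an assumption to verify before relying on them:

```python
from urllib.parse import quote

HIBP_API = "https://haveibeenpwned.com/api/v3/breachedaccount"

def build_hibp_request(email: str, api_key: str):
    """Return the URL and headers for a Have I Been Pwned v3 lookup.

    This only constructs the request; actually sending it requires a
    valid hibp-api-key and a descriptive user-agent, per HIBP's rules.
    """
    url = f"{HIBP_API}/{quote(email)}"  # the account must be URL-encoded
    headers = {"hibp-api-key": api_key, "user-agent": "breach-check-demo"}
    return url, headers

url, headers = build_hibp_request("someone@example.com", "YOUR-KEY-HERE")
print(url)  # the '@' is percent-encoded as %40
```

A 404 response from this endpoint means the account was not found in any breach, while a 200 returns the list of breaches it appears in.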



FEATURED

HPE and NASA Launch SBC-2 into Orbit


To infinity and beyond! That’s where Microsoft and HPE are planning on taking Azure cloud computing as it heads to the International Space Station (ISS). 

On February 20, HPE’s Spaceborne Computer-2 (SBC-2), launched to the ISS onboard Northrop Grumman’s robotic Cygnus cargo ship. The mission will bring edge computing, artificial intelligence capabilities, and a cloud connection to orbit on an integrated platform. Spaceborne Computer-2 will be installed on the ISS for the next two to three years. It’s hoped the edge computing system will enable astronauts to eliminate latency associated with sending data to and from Earth, tackle research, and gain insights immediately for real-time projects.


HPE anticipates the supercomputer being used for experiments ranging from processing medical imaging and DNA sequencing to unlocking key insights from volumes of data generated by remote sensors and satellites. Also on HPE's mind when the IT equipment was delivered to the ISS was whether non-IT-trained astronauts could install it and connect it to power, cooling, and the network. If that went well, the next question was whether it would work in space at all.

This isn't NASA's first rodeo when it comes to connecting cloud computing services to the ISS. In 2019, Amazon Web Services participated in a demonstration that used cloud-based processing to distribute live video streams from space. Surprisingly, it isn't HPE's first time either: in 2017, the company sent up its first Spaceborne Computer, which demonstrated supercomputer-level processing speeds of over a teraflop. Spaceborne computing has come a long way over the years, and now is a perfect time for the Microsoft-HPE collaboration. Recently, Microsoft extended its cloud footprint to the final frontier with Azure Space.



Microsoft Supports HPE's Spaceborne Computer with Azure

Microsoft and HPE are partnering to connect Azure to HPE's Spaceborne Computer-2 supercomputer, touting the collaboration as bringing compute and AI capabilities to the ultimate edge-computing device.

Cloud Computing is Out of This World: Microsoft and SpaceX Launch Azure Space

Originally, HPE and NASA partnered to build the Spaceborne Computer, described as an off-the-shelf supercomputer. The HPE Spaceborne Computer-2 is designed to simulate computation loads during space travel via data-intensive applications. By handling processing in space, researchers will be able to gain new information and advance work in areas never explored before. The HPE-Microsoft Spaceborne announcement is an expansion of Microsoft's Azure Space initiative: a set of products and newly announced partnerships designed to position Azure as a key player in the space- and satellite-related connectivity and compute part of the cloud market.

Spaceborne Computer-2 is purpose-built for harsh edge environments. Combining the power of the edge with the power of the cloud, SBC-2 will be connected to Microsoft Azure via NASA and HPE ground stations. HPE and Microsoft are gauging SBC-2's edge-computing capabilities and evolving machine-learning models to handle a variety of research challenges. They are hopeful that the new supercomputer can eventually aid in anticipating dust storms that could threaten future Mars missions and in using AI-enhanced ultrasound imaging to make in-space medical diagnoses.

Though SBC-2 will be used for research projects for two to three years, HPE and the ISS National Lab are taking requests. Do you have something you’d like to see measured in space? Let them know!


FEATURED

NHL Partners with AWS (Amazon) for Cloud Infrastructure

NHL Powered by AWS

“Do you believe in miracles? Yes!” This was ABC sportscaster Al Michaels' call “heard 'round the world” after the U.S. National Team beat the Soviet National Team at the 1980 Lake Placid Winter Olympic Games to advance to the medal round. One of the greatest sports moments ever, it lives on in hockey lore and is readily available for all of us to enjoy as many times as we want thanks to modern technology. Now the National Hockey League (NHL) is expanding its technological reach with a newly announced partnership with Amazon Web Services (AWS). AWS will become the official cloud storage partner of the league, making sure historic moments like the Miracle on Ice are never forgotten.

The NHL will rely exclusively on AWS for artificial intelligence and machine learning as it looks to automate video processing and content delivery in the cloud. AWS will also power the league's Puck and Player Tracking (PPT) System to better capture the details of gameplay. Hockey fans everywhere are in for a treat!

What is the PPT System?

The NHL has been working on the PPT system since 2013. Once installed in every arena in the league, the system will use several antennas in the rafters, tracking sensors worn by every player in the game, and tracking sensors built into the hockey pucks. The puck sensors can be read up to 2,000 times per second, yielding a stream of coordinates that can be turned into new results and analytics.
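As a toy illustration of what those coordinate streams enable (the numbers are hypothetical, assuming positions reported in metres), a derived metric like puck speed falls straight out of consecutive samples:

```python
import math

SAMPLE_RATE_HZ = 2_000  # puck sensor readings per second, per the league's spec

def puck_speeds(coords):
    """Convert consecutive (x, y) positions in metres into speeds in km/h."""
    dt = 1.0 / SAMPLE_RATE_HZ  # time between samples, in seconds
    speeds = []
    for (x0, y0), (x1, y1) in zip(coords, coords[1:]):
        metres_per_sec = math.dist((x0, y0), (x1, y1)) / dt
        speeds.append(metres_per_sec * 3.6)  # m/s -> km/h
    return speeds

# A puck moving 0.02 m between samples is travelling 40 m/s, i.e. 144 km/h.
print(puck_speeds([(0.0, 0.0), (0.02, 0.0), (0.04, 0.0)]))
```

Real tracking pipelines smooth and filter the raw samples before computing stats, but the core arithmetic is exactly this simple.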

The Puck Stops Here! Learn how the NHL’s L.A. Kings use LTO Tape to build their archive.

How Will AWS Change the Game?

AWS's state-of-the-art technology and services will give the NHL the capability to deliver analytics and insights that highlight the speed and skill of the game and drive deeper fan engagement. For example, a hockey fan in Russia could receive additional stats and camera angles for a major Russian player; for international audiences, that could be huge. Eventually, personalized feeds could be possible, with viewers able to mix and match various audio and visual elements.

The NHL will also build a video platform on AWS to store video, data, and related applications into one central source that will enable easier search and retrieval of archival video footage. Live broadcasts will have instant access to NHL content and analytics for airing and licensing, ultimately enhancing broadcast experiences for every viewer. Also, Virtual Reality experiences, Augmented Reality-powered graphics, and live betting feeds are new services that can be added to video feeds.

As part of the partnership, Amazon Machine Learning Solutions will cooperate with the league to use its tech for in-game video and official NHL data. The plan is to convert the data into advanced game analytics and metrics to further engage fans. The ability for data to be collected, analyzed, and distributed as fast as possible was a key reason why the NHL has partnered with AWS.

The NHL plans to use AWS Elemental Media to develop and manage cloud-based HD and 4K video content that will provide a complete view of the game to NHL officials, coaches, players, and fans. When making a crucial game-time decision on a penalty call, referees will have multi-angle 4K video and analytics to help them make the correct call on the ice. According to Amazon Web Services, the system will encode, process, store, and transmit game footage from a series of camera angles to provide continuous video feeds that capture plays and events outside the field of view of traditional cameras.

The NHL and AWS plan to roll out the new game features gradually over the coming seasons, making adjustments along the way to enhance the fan experience. One of the oldest and toughest sports around, hockey is about to take on a new, sleeker look. With all the data teams will be able to collect, we should expect a faster, stronger, more in-depth game. Do you believe in miracles? Hockey fans sure do!

FEATURED

Open Source Software

Open-source Software (OSS)

Open-source software, often referred to as OSS, is a type of computer software in which the source code is released under a license whose copyright holder grants users the rights to use, study, change, and distribute the software as they choose. Originating in the context of software development, the term open-source describes something people can modify and share because its design is publicly accessible. Nowadays, “open-source” indicates a wider set of values known as “the open-source way.” Open-source projects and initiatives support and observe standards of open exchange, mutual contribution, transparency, and community-oriented development.

What is the source code of OSS?

The source code associated with open-source software is the part of the software most users never see. Source code is the code that computer programmers can modify to change how the software works. Programmers with access to the source code can improve the program by adding features or fixing bugs that prevent the software from working correctly.

If you’re going to use OSS, you may want to consider also using a VPN. Here are our top picks for VPNs in 2021.

Examples of Open-source Software

For the software to be considered open-source, its source code must be freely available to its users. This allows its users the ability to modify it and distribute their versions of the program. The users also have the power to give out as many copies of the original program as they want. Anyone can use the program for any purpose; there are no licensing fees or other restrictions on the software. 

Linux is a great example of an open-source operating system. Anyone can download Linux, create as many copies as they want, and offer them to friends. Linux can be installed on an infinite number of computers. Users with more knowledge of program development can download the source code for Linux and modify it, creating their customized version of that program. 

Below is a list of the top 10 open-source software programs available in 2021.

  1. LibreOffice
  2. VLC Media Player
  3. GIMP
  4. Shotcut
  5. Brave
  6. Audacity
  7. KeePass
  8. Thunderbird
  9. FileZilla
  10. Linux

Setting up Linux on a server? Find the best server for your needs with our top 5.

Advantages and Disadvantages of Open-source Software

Similar to any other software on the market, open-source software has its pros and cons. Open-source software is typically easier to get than proprietary software, resulting in increased use. It has also helped to build developer loyalty as developers feel empowered and have a sense of ownership of the end product. 

Open-source software is usually a more flexible technology, quicker to innovate, and more reliable thanks to the thousands of independent programmers testing and fixing bugs around the clock. It is considered more flexible because modular systems allow programmers to build custom interfaces or add new capabilities. The quicker innovation of open-source programs is the result of collaboration among a large number of different programmers. Furthermore, open-source software is not reliant on the company or author that originally created it; even if the company fails, the code continues to exist and be developed by its users.

Open-source software also requires lower spending on marketing and logistical services, and it can be a great tool to boost a company's image, including that of its commercial products. The OSS development approach has helped produce reliable, high-quality software quickly and inexpensively. A 2008 report by the Standish Group estimated that the adoption of open-source software models saves consumers about $60 billion per year.

On the flip side, an open-source development process may lack the well-defined stages usually required, such as system testing and documentation, both of which may be skipped. Skipping these stages has mostly been an issue for small projects; larger projects tend to define and impose at least some of the stages out of the necessities of teamwork.

Not all OSS projects have been successful, either; SourceXchange and Eazel, for example, both failed. It is also difficult to build a financially strong business model around the open-source concept: a project may satisfy technical requirements without meeting those needed for market profitability. Regarding security, open source may allow hackers to discover the weaknesses or gaps in the software more easily than closed-source software would.

Benefits for Users of OSS

The most obvious benefit of open-source software is that it can be used for free. Take the example of Linux above: unlike with Windows, users can install or distribute as many copies of Linux as they want, without limitations. Installing Linux for free can be especially useful for servers. If users want to set up a virtualized cluster of servers, they can easily duplicate a single Linux server, without worrying about licensing or how many instances of Linux they're authorized to run.

An open-source program is also more flexible, allowing users to modify their own version with an interface that works for them. When a Linux desktop introduces a new interface that some users aren't fans of, they can modify it to their liking. Open-source software also allows developers to “be their own creator” and design their own software. Witness Android and Chrome OS, operating systems built on Linux and other open-source software; the core of Apple's OS X was built on open-source code, too. When users can manipulate the source code and develop software tailored to their needs, the possibilities are truly endless.

FEATURED

Malvertising Simply Explained

What is Malvertising?

Malvertising (a combination of the words “malicious” and “advertising”) is a cyber tactic that attempts to spread malware through online advertisements. The attack typically involves injecting malicious or malware-laden advertisements into legitimate online advertising networks and websites. The code then redirects users to malicious websites, allowing hackers to target them. In the past, reputable websites such as The New York Times Online, The London Stock Exchange, Spotify, and The Atlantic have been victims of malvertising. Because the advertising content is implanted into high-profile and reputable websites, malvertising gives cybercriminals a way to push their attacks to web users who might not otherwise see the ads because of firewalls or malware protection.

Online advertising can be a pivotal source of income for websites and internet properties. To reach large online audiences, advertising networks have become extensive, involving publisher sites, ad exchanges, ad servers, retargeting networks, and content delivery networks. Malvertising takes advantage of these pathways, turning them into a dangerous tool that requires little input from its victims.

Protect your business’s data by setting up a zero-trust network. Find out how by reading the blog.

How Does Malvertising Get Online?

There are several approaches a cybercriminal might use, but the goal is always to get the user to download malware or be directed to a malicious server. The most common strategy is to submit malicious ads to third-party online ad vendors. If a vendor approves the ad, the seemingly innocent ad will be served through any number of sites the vendor works with. Online vendors are aware of malvertising and actively work to prevent it, which is why it's important to work only with trustworthy, reliable vendors for any online ad services.

What is the Difference Between Malvertising and Adware?

As expected, malvertising can sometimes be confused with adware. Where malvertising is malicious code intentionally placed in ads, adware is a program that runs on a user's computer. Adware is usually installed hidden inside a package that also contains legitimate software, or it lands on the machine without the user's knowledge. Adware displays unwanted advertising, redirects search requests to advertising websites, and mines data about the user to help target or serve advertisements.

Some major differences between malvertising and adware include:

  • Malvertising is a form of malicious code deployed on a publisher's web page, whereas adware is only used to target individual users.
  • Malvertising only affects users viewing an infected webpage, while adware operates continuously on a user's computer.

SolarWinds was the biggest hack of 2020. Learn more about how you may have been affected.

What Are Some Examples of Malvertising?

The problem with malvertising is that it is so difficult to spot. It is frequently circulated by the ad networks we trust; companies like Spotify and Forbes have both suffered malvertising campaigns that infected their users and visitors with malware. Two more recent examples of malvertising are RoughTed and KS Clean. First reported in 2017, RoughTed was particularly significant because it was able to bypass ad blockers. It also evaded many antivirus programs by dynamically creating new URLs, which made it harder to track and block the malicious domains it was using to spread itself.

KS Clean was adware hidden within real mobile apps, targeting victims through malvertising ads that would download malware the moment a user clicked on an ad. The malware downloaded silently in the background; the only indication that anything was off was an alert on the user's mobile device claiming a security issue and prompting the user to upgrade the app to solve the problem. When the user clicked “OK,” the installation finished and the malware was granted administrative privileges. Those privileges let the malware push unlimited pop-up ads on the user's phone, making them almost impossible to disable or uninstall.

How Can Users Prevent Malvertising?

While organizations should always take a strong position against unwarranted attacks of any kind, malvertising should be high on the priority list for advertising channels. Network traffic analysis at the firewall can help identify suspicious activity before malware has a chance to infect the user.

Some other tips for preventing malvertising attacks include the following:

  • Employee training is the best way to form a proactive company culture that is aware of cyber threats and the latest best practices for preventing them. 
  • Keep all systems and software updated with the latest patches and safest versions.
  • Only work with trustworthy, reliable online advertising vendors.
  • Use online ad-blockers to help prevent malicious pop-up ads from opening a malware download.
FEATURED

TOP 5 VPNs OF 2021

In today's working environment, no one knows when, or even whether, remote work will go away. That makes remote VPN access all the more important for protecting your privacy and security online. As the landscape of commercial VPNs continues to grow, sorting through the options to find the best VPN for your particular needs can be daunting. That's exactly what inspired this article: we've put together a list of the five best and most reliable VPN options for you.

What is a VPN and why do you need one?

VPN is short for virtual private network. A VPN lets users enjoy online privacy and anonymity by creating a private network over a public internet connection. It disguises your IP address, so your online actions are virtually untraceable. More importantly, a VPN creates secure, encrypted connections that provide greater privacy than even a secured Wi-Fi hotspot can.

Think about all the times you've read email while sitting at a coffee shop or checked your bank balance while eating at a restaurant. Unless you were logged into a private network that required a password, any data transmitted on your device could be exposed. Accessing the web on an unsecured Wi-Fi network means you could be exposing your private information to nearby observers. That's why a VPN should be a necessity for anyone worried about online security and privacy: the encryption and privacy a VPN offers protect your online searches, emails, shopping, and even bill paying.

Take a look at our top 5 server picks for 2021.

Our Top 5 List of VPNs for 2021

ExpressVPN

  • Number of IP addresses: 30,000
  • Number of servers: 3,000+ in 160 locations
  • Number of simultaneous connections: 5
  • Country/jurisdiction: British Virgin Islands
  • 94-plus countries

ExpressVPN is powered by TrustedServer technology, built to ensure that no logs of online activity are ever kept. ExpressVPN has a solid track record in the privacy world, having faced a server seizure by authorities that proved its zero-log policy to be true. It offers a useful kill-switch feature, which prevents network data from leaking outside the secure VPN tunnel if the VPN connection fails, and it accepts bitcoin as a payment method, adding an additional layer of privacy at checkout.

Protect your data using an airgap with LTO Tape: Read the Blog

Surfshark

  • Number of servers: 3,200+
  • Number of server locations: 65
  • Jurisdiction: British Virgin Islands

Surfshark's network is smaller than some, but the service makes up for it with the features and speeds it offers. Its biggest benefit is unlimited device support, meaning users don't have to worry about how many devices they have connected. It also offers antimalware, ad blocking, and tracker blocking as part of its software. Surfshark has a solid range of app support, running on Mac, Windows, iOS, Android, Fire TV, and routers. Supplementary devices such as game consoles can be set up for Surfshark through DNS settings. Surfshark also offers three special modes designed for those who want to bypass restrictions and hide their online footprints: Camouflage Mode hides users' VPN activity so the ISP doesn't know they're using a VPN; MultiHop jumps the connection through multiple countries to hide any trail; and NoBorders Mode allows users to use Surfshark successfully in restrictive regions.

NordVPN

  • Number of IP addresses: 5,000
  • Number of servers: 5,200+ servers
  • Number of server locations: 62
  • Country/jurisdiction: Panama
  • 62 countries

NordVPN is one of the most established brands in the VPN market. It offers a large concurrent-connection count, with six simultaneous connections through its network, where nearly all other providers offer five or fewer. NordVPN also offers a dedicated IP option for those who want a fixed address, as well as a kill-switch feature that prevents network data from leaking outside the secure VPN tunnel if the VPN connection fails. While NordVPN long had a spotless reputation, a report emerged that one of its rented servers was accessed without authorization back in 2018. Nord's response to the discovery included multiple security audits, a bug bounty program, and heavier investment in server security. The fact that the breach was limited in nature and involved no user-identifying information served to further prove that NordVPN keeps no logs of user activity.

Looking for even more security? Find out how to set up a Zero Trust Network here.

IPVanish

  • Number of IP addresses: 40,000+
  • Number of servers: 1,300
  • Number of server locations: 60
  • Number of simultaneous connections: 10
  • Country/jurisdiction: US

A huge benefit of IPVanish is its easy-to-use platform, ideal for users interested in learning what a VPN does behind the scenes. Its multiplatform flexibility is also perfect for people looking for a Netflix-friendly VPN. A special feature of IPVanish is its support for Kodi, the open-source media-streaming app. The company has earned praise for its recent increase from five to ten simultaneous connections. Like other VPNs on this list, IPVanish has a kill switch, a must for anyone serious about remaining anonymous online.

Norton Secure VPN

  • Number of countries: 29
  • Number of servers: 1,500 (1,200 virtual)
  • Number of server locations: 200 in 73 cities
  • Country/jurisdiction: US

Norton has long been known for excellence in security products and now offers a VPN service. However, its offering is limited: it does not support P2P, Linux, routers, or set-top boxes, though it does offer Netflix and streaming compatibility. Norton Secure VPN speeds are comparable to other mid-tier VPNs in the same segment. The service is available on four platforms: Mac, iOS, Windows, and Android. It is one of the few VPN services to offer live 24/7 customer support and a 60-day money-back guarantee.

FEATURED

How To Set Up A Zero-Trust Network

How to set up a zero-trust network

In the past, IT and cybersecurity professionals approached their work with a strong focus on the network perimeter. Everything within the network was assumed to be trusted, while everything outside it was a possible threat. Unfortunately, this approach has not survived the test of time, and organizations now operate in a threat landscape where an attacker may already have one foot in the door of their network. How did this come to be? Over time, cybercriminals have gained entry through compromised systems, vulnerable wireless connections, stolen credentials, and other avenues.

The best way to avoid a cyberattack in this sophisticated new environment is by adopting a zero-trust philosophy. In a zero-trust network, the only safe assumption is that no user or device is trusted until proven otherwise. With this approach in mind, let's explore what a zero-trust network is and how you can implement one in your business.

Interested in knowing the top 10 ITAD tips for 2021? Read the blog.

Image courtesy of Cisco

What is a zero-trust network and why is it important?

A zero-trust network, sometimes referred to as zero-trust security, is an IT security model that requires identity verification for every person and device trying to access resources on a private network. There is no single technology associated with this model; rather, it is a holistic approach to network security that incorporates several different principles and technologies.

Traditionally, an IT network is secured with the castle-and-moat methodology: it is hard to gain access from outside the network, but everyone inside the network is trusted. The problem with this security model is that once hackers gain access to the network, they are free to do as they please, with no roadblocks to stop them.

The original theory of zero trust was conceived over a decade ago; however, the unforeseen events of this past year have propelled it to the top of enterprise security plans. The COVID-19 pandemic drove a mass influx of remote working, fracturing organizations’ customary perimeter-based security models. With the increase in remote working, an organization’s network is no longer a single entity in one location. The network now exists everywhere, 24 hours a day.

Businesses that pass on adopting a zero-trust network risk a breach in one part of their network quickly spreading as malware or ransomware. Ransomware attacks have increased massively in recent years: from hospitals to local governments to major corporations, they have caused large-scale outages across every sector. Going forward, implementing a zero-trust network is the way to go. That’s why we put together a list of things you can do to set one up.

These were the top 5 cybersecurity trends from 2020, and what we have to look forward to this year.

Image courtesy of Varonis

Proper Network Segmentation

Proper network segmentation is the cornerstone of a zero-trust network. Systems and devices must be separated by the types of access they allow and the information they process. Network segments act as the trust boundaries that allow other security controls to enforce the zero-trust posture.
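As a rough illustration of the default-deny idea behind segmentation, the sketch below models traffic between segments being denied unless an explicit rule allows it. The segment names and ports are purely illustrative, not tied to any specific product.

```python
# Minimal sketch of segment-level policy: traffic between segments is
# denied unless an explicit rule allows it (default deny).

# Each rule allows traffic from one segment to another on one port.
ALLOWED_FLOWS = {
    ("user-lan", "web-dmz", 443),   # users may reach the web tier
    ("web-dmz", "app-tier", 8443),  # web tier may reach the app tier
    ("app-tier", "db-tier", 5432),  # app tier may reach the database
}

def is_allowed(src_segment: str, dst_segment: str, port: int) -> bool:
    """Return True only if an explicit rule permits this flow."""
    return (src_segment, dst_segment, port) in ALLOWED_FLOWS

# Users can reach the web tier, but never the database directly.
print(is_allowed("user-lan", "web-dmz", 443))   # True
print(is_allowed("user-lan", "db-tier", 5432))  # False
```

Because the policy is a whitelist, adding a new segment grants it nothing until a rule is written for it, which is exactly the zero-trust default.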

Improve Identity and Access Management

A strong identity and access management foundation is a necessity for applying zero-trust security. Start by identifying who or what is attempting to connect to the network; most organizations use one or more identity and access management tools to do this. Users and autonomous devices must then prove who or what they are through authentication, and multifactor authentication provides added assurance of identity and protects against theft of individual credentials.
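Multifactor authentication often relies on time-based one-time passwords. As a sketch of how a server might verify such a code, here is a minimal RFC 6238 (TOTP) implementation using only the Python standard library; the secret shown is the RFC’s published test key, not a real credential.

```python
import base64, hashlib, hmac, struct, time

def totp(secret_b32: str, for_time=None, digits=6, step=30):
    """Compute an RFC 6238 time-based one-time password (HMAC-SHA1)."""
    key = base64.b32decode(secret_b32)
    t = time.time() if for_time is None else for_time
    counter = int(t // step)                     # 30-second time window
    digest = hmac.new(key, struct.pack(">Q", counter), hashlib.sha1).digest()
    offset = digest[-1] & 0x0F                   # dynamic truncation (RFC 4226)
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

# RFC 6238 test vector: the standard SHA-1 test key at t = 59 seconds.
secret = "GEZDGNBVGY3TQOJQGEZDGNBVGY3TQOJQ"
print(totp(secret, for_time=59, digits=8))  # 94287082
```

Verification is simply the server recomputing the code from its copy of the secret and comparing it (with a small tolerance for clock drift) to what the user typed in.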

Least Privilege and Micro Segmentation

Least privilege applies to both networks and firewalls. After segmenting the network, cybersecurity teams must lock down access between segments to only the traffic essential to business needs. If two or more remote offices do not need direct communication with each other, that access should not be granted. Once a zero-trust network positively identifies a user or device, it must have controls in place to grant access to only the applications, files, and services that user or device needs. Depending on the software or machines in use, access control can be based on user identity alone, or combine network segmentation with user and device identification. This is known as micro segmentation: building highly secure subsets within a network where a user or device can connect to and access only the resources and services it needs. Micro segmentation is great from a security standpoint because it significantly limits the damage to infrastructure if a compromise occurs.
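The grant-only-what’s-needed model can be sketched in a few lines; the identities and resource names below are invented for illustration.

```python
# Sketch of per-identity least privilege: each user or device gets an
# explicit grant list, and anything not granted is denied by default.

GRANTS = {
    "alice@example.com": {"crm-app", "file-share"},
    "build-server-01":   {"artifact-repo"},
}

def can_access(identity: str, resource: str) -> bool:
    """Default deny: access requires an explicit grant."""
    return resource in GRANTS.get(identity, set())

print(can_access("alice@example.com", "crm-app"))        # True
print(can_access("alice@example.com", "artifact-repo"))  # False
print(can_access("unknown-device", "crm-app"))           # False
```

Note that an identity with no entry at all gets an empty grant set, so a newly seen device can reach nothing until someone deliberately grants it access.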

Add Application Inspection to the Firewall

Cybersecurity teams need to add application inspection technology to their existing firewalls, ensuring that traffic passing through a connection carries appropriate content. Contemporary firewalls go far beyond the simple rule-based inspection they previously offered.

Record and Investigate Security Incidents

A great security system requires visibility, and visibility requires awareness. Cybersecurity teams can only do their job effectively if they have a complete view of security incidents collected from systems, devices, and applications across the organization. A security information and event management (SIEM) program provides analysts with a centralized view of the data they need.
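A toy version of the centralized view a SIEM provides might look like this in Python, with made-up event records standing in for real log sources.

```python
# Sketch of the centralized-view idea behind a SIEM: events from many
# sources are normalized into one stream, then filtered and counted.
from collections import Counter

events = [
    {"source": "firewall", "severity": "high", "msg": "blocked outbound to known C2"},
    {"source": "endpoint", "severity": "low",  "msg": "user login"},
    {"source": "app",      "severity": "high", "msg": "repeated auth failures"},
    {"source": "endpoint", "severity": "high", "msg": "new persistence mechanism"},
]

# One centralized view: count high-severity events per source.
high_by_source = Counter(e["source"] for e in events if e["severity"] == "high")
print(dict(high_by_source))  # {'firewall': 1, 'app': 1, 'endpoint': 1}
```

A real SIEM adds normalization, correlation rules, and alerting on top, but the core value is the same: one queryable stream instead of scattered logs.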

Image courtesy of Cloudflare
FEATURED

Top 10 ITAD Tips of 2021

From a business perspective, one of the biggest takeaways from last year is how companies were forced to become flexible and adapt to the Covid-19 pandemic, from migrating to remote work for the foreseeable future to managing budgets more strictly and cutting back. Some more experienced organizations took steps to update their information technology asset disposition (ITAD) strategies going forward. Multiple factors go into creating a successful ITAD strategy, and successful ITAD management requires a strict, well-defined process. Below are ten expert tips to take with you into a successful 2021.

1 – Do Your Homework

Multiple certifications are available to help companies identify which ITAD service providers have taken the time to create processes in accordance with local, state and federal laws. Having ITAD processes in a structured guidebook is important, but most would agree that the execution of the procedures is entirely different. A successful ITAD service comes down to the people following the process set in place. When selecting an ITAD partner, make sure you do your homework.

You can learn more about our ITAD processes here.

2 – Request a Chain of Custody 

Every ITAD process should cover several key areas including traceability, software, logistics and verification. Be sure to maintain a clear record of serial numbers on all equipment, physical location, purchase and sale price and the staff managing the equipment. The entire chain of custody should be recorded, as well as multiple verification audits ensuring data sanitization and certificates of data destruction are issued. 

Read more about how a secure chain of custody works.

3 – Create a Re-Marketing Strategy

Creating a re-marketing strategy can help ease the financial burden of managing the ITAD process. Donation, wholesale, and business-to-consumer sales are the primary marketplace channels for IT assets. Re-marketing can go a long way toward offsetting the costs of ITAD operations.

4 – Maintain an Accurate List of Assets

Many organizations use their IT asset management software to create an early list of assets that need to be retired, and sometimes this initial list becomes the master list used in their ITAD program. However, IT assets that are not on the network usually go undetected by the software, so the list should be verified physically. Common asset tracking identifiers used to classify inventory include make, model, serial number, and asset tag.
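One way to catch off-network assets is to reconcile the software-generated list against a physical audit; here is a minimal Python sketch with invented serial numbers.

```python
# Reconcile the software-generated asset list against a physical audit,
# so off-network devices are not missing from the master list.

software_list = {"SN-1001", "SN-1002", "SN-1003"}               # detected on the network
physical_audit = {"SN-1001", "SN-1002", "SN-1003", "SN-1004"}   # found on the floor

# Anything seen in person but never by the tool must be added manually.
missing_from_master = physical_audit - software_list
print(missing_from_master)  # {'SN-1004'}
```

In practice each serial number would carry the full make / model / asset-tag record, but the set-difference check is the same.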

5 – Choose a GDPR-Compliant Provider

Some of the biggest beneficiaries to emerge from the Covid-19 pandemic were cloud providers. However, selecting which cloud provider to use is critical. Find a cloud provider that allows users to access documents from a GDPR-compliant cloud-based server, keeping the documents within GDPR legislation.

Learn More About How We Help Businesses Stay Compliant

6 – Avoid GDPR-Related Fines

Similar to the previous tip, it is important that data and documents are classified centrally, so employees can make legal and informed decisions as to what documents they can, or cannot, access on personal devices. Ensure GDPR policies are in place and adhered to for all staff, wherever they may be working. 

7 – Erase Data Off of Personal Assets

Hopefully, in the near future, Covid-19 will no longer be a threat to businesses, and regular life and work will resume. When that happens, it is wise to consider whether employees used their personal devices while working from home. If so, all corporate documents and data stored on those devices must be erased accordingly. Put a policy in place for staff to sanitize their devices. This will help companies avoid running afoul of laws relating to data mismanagement and eliminate the possibility of sensitive corporate information remaining on personal devices.

Learn more about secure hard drive erasure.

8 – Ask the Right Questions

In the past, it was uncommon for organizations to practice strict selection processes and vetting for ITAD providers. Companies didn’t know which questions to ask and most were satisfied with simply hauling away their retired IT equipment. Now, most organizations issue a detailed report evaluating ITAD vendor capabilities and strengths. The reports generally include information regarding compliance, data security, sustainability and value recovery. 

9 – Use On-Site Data Destruction

Just one case of compromised data can be overwhelming for a company, so confirming the security of all data-storing assets is imperative. An estimated 65 percent of businesses require data destruction while their assets are still in their custody. The rise of on-site data destruction services was foreseeable, as on-site destruction is one of the highest levels of security service in the industry.

Learn more about our on-site data destruction services here.

10 – Increase Your Value Recovery

Even if the costs of partnering with an ITAD vendor weren’t in the budget, there are still ways you can increase your value recovery.

  • Don’t wait to resell. When it comes to value recovery of IT assets, timing is everything. New IT innovations combined with short refresh cycles are among the reasons IT assets depreciate in value so quickly.
  • Take time to understand your ITAD vendor’s resale channels and strategies. A vendor who maintains active and varied resale channels is preferred. 
  • Know the vendor’s chain of custody. Each phase of moving IT equipment from your facility to an ITAD services center, and eventually to secondary market buyers should be considered.

SolarWinds Orion: The Biggest Hack of the Year

Federal agencies faced one of their worst nightmares this past week when they were informed of a massive compromise by foreign hackers within their network management software. An emergency directive from the Cybersecurity and Infrastructure Security Agency (CISA) instructed all agencies using SolarWinds products to review their networks and disconnect or power down the company’s Orion software. 

Orion has been used by the government for years, and the software operates at the heart of some crucial federal systems. SolarWinds has been supplying agencies for some time as well, first developing tools to understand how their servers were operating, and later branching into network and infrastructure monitoring. Orion is the structure binding all of those things together. According to a preliminary search of the Federal Procurement Data System – Next Generation (FPDS-NG), at least 32 federal agencies have bought SolarWinds Orion software since 2006.

Listed below are some of the government agencies and departments that have been awarded contracts for SolarWinds Orion products. Even though all of them bought SolarWinds Orion products, that doesn’t mean they were using them between March and June, when the vulnerability was introduced during updates. Agencies that have ongoing contracts for SolarWinds Orion products include the Army, DOE, FLETC, ICE, IRS, and VA. SolarWinds estimates that fewer than 18,000 customers installed products with the vulnerability during that time.

  • Bureaus of Land Management, Ocean Energy Management, and Safety and Environmental Enforcement, as well as the National Park Service and Office of Policy, Budget, and Administration within the Department of the Interior
  • Air Force, Army, Defense Logistics Agency, Defense Threat Reduction Agency, and Navy within the Department of Defense
  • Department of Energy
  • Departmental Administration and Farm Service Agency within the U.S. Department of Agriculture
  • Federal Acquisition Service within the General Services Administration
  • FBI within the Department of Justice
  • Federal Highway Administration and Immediate Office of the Secretary within the Department of Transportation
  • Federal Law Enforcement Training Center, Transportation Security Administration, Immigration and Customs Enforcement, and Office of Procurement Operations within the Department of Homeland Security
  • Food and Drug Administration, National Institutes of Health, and Office of the Assistant Secretary for Administration within the Department of Health and Human Services
  • IRS and Office of the Comptroller of the Currency within the Department of the Treasury
  • NASA
  • National Oceanic and Atmospheric Administration within the Department of Commerce
  • National Science Foundation
  • Peace Corps
  • State Department
  • Department of Veterans Affairs

YOU CAN READ THE JOINT STATEMENT BY THE FEDERAL BUREAU OF INVESTIGATION (FBI), THE CYBERSECURITY AND INFRASTRUCTURE SECURITY AGENCY (CISA), AND THE OFFICE OF THE DIRECTOR OF NATIONAL INTELLIGENCE (ODNI) HERE.

How the Attack was Discovered

When cybersecurity firm FireEye Inc. discovered that it was the victim of a malicious cyber-attack, the company’s investigators began trying to figure out exactly how attackers got past its defenses. They quickly found out that they were not the only victims. Investigators uncovered a weakness in a product made by one of its software providers, SolarWinds Corp. After combing through 50,000 lines of source code, they concluded there was a backdoor within SolarWinds’ software. FireEye contacted SolarWinds and law enforcement immediately after the backdoor was found.

Hackers, believed to be part of an elite Russian group, took advantage of the vulnerability to insert malware, which found its way into the systems of SolarWinds customers through software updates. So far, as many as 18,000 entities may have downloaded the malware. The hackers who attacked FireEye stole sensitive tools that the company uses to find vulnerabilities in clients’ computer networks. FireEye’s investigation revealed that the hack on the firm itself was part of a global campaign by a highly sophisticated attacker that also targeted government, consulting, technology, telecom, and extractive entities in North America, Europe, Asia, and the Middle East.

The hackers who carried out the attack were sophisticated in ways not seen before. They took innovative steps to conceal their actions, such as operating from servers based in the same city as the employees they were impersonating. The hackers breached U.S. government entities by first attacking SolarWinds, their IT provider. By compromising the software that government entities and corporations use to monitor their networks, the hackers gained a foothold and dug deeper, all while appearing as legitimate traffic.

Read how Microsoft and US Cyber Command joined forces to stop a vicious malware attack earlier this year.

How Can the Attack Be Stopped?

Technology firms are seizing some of the hackers’ key infrastructure as the U.S. government works to contain a hacking campaign that relies on SolarWinds software. FireEye worked with Microsoft and the domain registrar GoDaddy to take over one of the domains that the attackers had used to send malicious code to victims. The move is not a cure-all for stopping the cyber-attack, but it should help stem the surge of victims, which includes the Departments of Treasury and Homeland Security.

 

According to FireEye, the seized domain acts as a “killswitch” that will affect new and previous infections of the malicious code coming from that particular domain. If the domain resolves to an IP address meeting certain conditions, the malware terminates itself and prevents further execution. The killswitch makes it harder for the attackers to use the malware they have already deployed. However, FireEye warned that the hackers still have other ways of keeping access to networks: in the intrusions FireEye has seen, the attacker moved quickly to establish additional persistence mechanisms for accessing victim networks.
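A heavily simplified model of how such a killswitch can work might look like this; the IP range and logic here are illustrative, not the actual malware’s details.

```python
import ipaddress

# Very simplified model of the reported behavior: the malware resolves a
# domain and, if the answer falls inside certain "kill" IP ranges, it
# terminates itself. The range below is an example only.
KILL_RANGES = [ipaddress.ip_network("20.140.0.0/15")]

def should_terminate(resolved_ip: str) -> bool:
    """Return True if the resolved address falls in a kill range."""
    addr = ipaddress.ip_address(resolved_ip)
    return any(addr in net for net in KILL_RANGES)

# Once defenders control the domain, they can answer every lookup with
# an address inside a kill range, deactivating deployed implants.
print(should_terminate("20.140.5.5"))   # True  -> malware stops
print(should_terminate("203.0.113.9"))  # False -> malware keeps running
```

This is why seizing the domain is so powerful: defenders get to choose the DNS answer, and the malware’s own logic does the rest.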

 

The FBI is investigating the compromise of SolarWinds’ software updates, which has been linked to a Russian intelligence service. SolarWinds’ software is used throughout the Fortune 500 and in critical sectors such as electricity. The killswitch action highlights the power major technology companies have to throw up roadblocks against well-resourced hackers, much like Microsoft teaming up with US Cyber Command to disrupt the powerful Trickbot botnet in October.


5 Cyber Security Trends from 2020 and What We Can Look Forward to Next Year

Today’s cybersecurity landscape is changing at a faster rate than we’ve ever experienced before. Hackers are inventing new ways to attack businesses, and cybersecurity experts are relentlessly trying to find new ways to protect them. Costing businesses approximately $45 billion, cyber-attacks can be disastrous, causing adverse financial and non-financial effects. Cyber-attacks can also result in loss of sensitive data, never-ending lawsuits, and a smeared reputation.

 

With cyber-attack rates on the rise, companies need to up their defenses. Businesses should take the time to brush up on cybersecurity trends for the upcoming year, as this information could help them prepare and avoid becoming another victim of a malicious attack. Given the importance of cyber security in the current world, we’ve gathered a list of the top trends seen in cybersecurity this year and what you can expect in 2021.

INCREASE IN SPENDING

 

It’s no secret that cybersecurity spending is on the rise. It has to be in order to keep up with the rapidly changing technology landscape we live in. In 2019 alone, global cybersecurity spending was estimated at around $103 billion, a 9.4% increase from 2018. This year the US government spent $17.4 billion on cybersecurity, a 5% increase from 2019. Even more alarming is the fact that cybercrime is projected to cost more than $6 trillion annually by 2021, up from $3 trillion in 2015. The most significant factor driving this increase is the improved efficiency of cybercriminals. The dark web has become a booming black market where criminals can launch complex cyberattacks. With lower barriers to entry and massive financial payoffs, we can expect cybercrime to grow well into the future.

 

Learn more about how Microsoft is teaming up with US national security agencies to defeat a threatening malware bot.

COMPANIES CONTINUE TO LEARN

 

Demand for cybersecurity experts continued to outstrip supply in 2020, and we don’t see this changing anytime soon. Amid this trend, security experts contend with considerably more threats than ever before. Currently, more than 4 million professionals in the cybersecurity field are being tasked with closing the skills gap. Since the cybersecurity learning curve won’t be flattening anytime soon, companies must settle on strategies that address the shortage of talent. Options include cross-training existing IT staff, recruiting professionals from other areas, or setting job qualifications at appropriate levels in order to attract more candidates.

 

Most organizations are starting to realize that cybersecurity intelligence is a critical piece of growth. Understanding attackers’ behavior and tendencies can help in anticipating attacks and reacting quickly after one happens. A significant problem is the sheer volume of data available from multiple sources, compounded by the fact that security and planning technologies typically do not mix well. In the future, expect continued emphasis on developing the next generation of cybersecurity professionals.

THE INFLUENCE OF MACHINE INTELLIGENCE DEVELOPS

 

Artificial intelligence (AI) and machine learning (ML) are progressively becoming necessary for cybersecurity. Integrating AI with cybersecurity solutions can have positive outcomes, such as improving detection of threats and malicious activity and supporting fast responses to cyber-attacks. The market for AI in cybersecurity is growing at a drastic pace: in 2019, demand surpassed $8.8 billion, and the market is projected to grow to $38.2 billion by 2026.

 

Find out how the US military is integrating AI and ML into keeping our country safe.

MORE SMALL BUSINESSES INVEST IN CYBER PROTECTION

 

When we think of a cyber-attack occurring, we tend to envision a multibillion-dollar conglomerate that easily has the funds to pay the ransom for data retrieval and boost its security the next time around. Surprisingly, 43% of cyber-attacks happen to small businesses, costing them an average of $200,000. Sadly, when small businesses fall victim to these attacks, 60% of them go out of business within six months.

 

Hackers go after small businesses because they know many have poor or even no preventative measures in place. A large number of small businesses even think they’re too small to be victims of cyber-attacks. Tech-savvy small businesses, however, are increasingly taking a preventative approach to cybersecurity. They understand that, like big organizations, they are targets for cybercrime, and they are adopting effective cybersecurity strategies accordingly. As a result, a number of small businesses plan to increase their spending on cybersecurity and invest in information security training.

 

We have the ultimate cure to the ransomware epidemic plaguing small businesses.

CYBER-ATTACKS INCREASE ON CRITICAL INFRASTRUCTURES

 

Utility companies and government agencies are extremely critical to the economy because they support millions of people across the nation. Critical infrastructure includes public transportation systems, power grids, and large-scale construction. These government entities store massive amounts of personal data about citizens, such as health records, residency information, and even bank details. If this personal data is not well protected, it could fall into the wrong hands, resulting in disastrous breaches. This is also what makes these organizations an excellent target for a cyber-attack.

 

Unfortunately, the trend is anticipated to continue into 2021 and beyond because most public organizations are not adequately prepared to handle an attack. While governments may be ill prepared for cyber-attacks, hackers are busy preparing for them. 

 

Curious About the Future of all Internet Connected Devices? Read Our Blog here

WHAT CAN WE LOOK FORWARD TO IN 2021?

Going into a new year, it’s obvious that many elements are coming together to increase cyber risk for businesses. Industry and economic growth continue to push organizations toward rapid digital transformation, accelerating the use of new technologies and increasing exposure to their inherent security issues. The combination of fewer cybersecurity experts and an increase in cybercrime are trends that will continue for some time to come. Businesses that invest in technology, security, and cybersecurity talent can greatly reduce their risk of a cyber-attack and increase the likelihood that cybercriminals will look elsewhere for a less prepared target.


4G on the Moon – NASA awards Nokia $14 Million

Cellular Service That’s Out of This World

As soon as 2024, we may see humans revisit the moon. Except this time, we should be able to communicate with them in real time from a cellular device. Down here on Earth, the competition between telecom providers is as intense as ever. However, Nokia may have just taken one giant leap over its competitors by expanding into a new market: the company has won a $14.1 million contract from NASA to put a 4G network on the moon.

Why put a communications network on the moon?

Now, you may be wondering, “why would we need a telecommunications network on the moon?” According to Nokia Bell Labs researchers, installing a 4G network on the surface of Earth’s natural satellite will help show whether human habitation on the moon is possible. A super-compact, low-power, space-hardened wireless 4G network will greatly advance the US space agency’s plan to establish a long-term human presence on the moon by 2030. Astronauts will carry out detailed experiments and explorations which the agency hopes will help it develop its first human mission to Mars.

Nokia’s 4G LTE network, the predecessor to 5G, will deliver key communication capabilities for many different data transmission applications, including vital command and control functions, remote control of lunar rovers, real-time navigation and streaming of high definition video. These communication applications are all vital to long-term human presence on the lunar surface. The network is perfectly capable of supplying wireless connectivity for any activity that space travelers may need to carry out, enabling voice and video communications capabilities, telemetry and biometric data exchange, and deployment and control of robotic and sensor payloads.

Learn more about “radiation-hardened” IT equipment used by NASA in our blog.

How can Nokia pull this off?

When it comes to space travel and past moon landings, you always hear about how much can go wrong; look at Apollo 13, for instance. Granted, technology has vastly improved in the past half century, but installing a network on the moon is still a large feat. The network Nokia plans to implement will be designed for the moon’s distinctive climate, with the ability to withstand extreme temperatures, radiation, and even the vibrations created by rocket landings and launches. The moon’s 4G network will also use much smaller cells than those on Earth, having a shorter range and requiring less power.

Nokia is partnering with Intuitive Machines for this mission to integrate the network into their lunar lander and deliver it to the lunar surface. The network will self-configure upon deployment and establish the first LTE communications system on the Moon. Nokia’s network equipment will be installed remotely on the moon’s surface using a lunar hopper built by Intuitive Machines in late 2022.

According to Nokia, the lunar network involves an LTE Base Station with integrated Evolved Packet Core (EPC) functionalities, LTE User Equipment, RF antennas and high-reliability operations and maintenance (O&M) control software. The same LTE technologies that have met the world’s mobile data and voice demands for the last decade are fully capable of providing mission critical and state-of-the-art connectivity and communications capabilities for the future of space exploration. Nokia plans to supply commercial LTE products and provide technology to expand the commercialization of LTE, and to pursue space applications of LTE’s successor technology, 5G.

Why did Nokia win the contract to put a network on the moon?

An industry leader in end-to-end communication technologies for service provider and enterprise customers all over the world, Nokia develops and provides networks for airports, factories, industrial sites, first responders, and the harshest mining operations on Earth. Its networks have long proven themselves reliable for automation, data collection, and dependable communications. By installing its technologies in the most extreme environment known to man, Nokia will validate the solution’s performance and technology readiness, enhancing it for future space missions and human habitation.


Introducing the Apple M1 Chip

Over 35 years ago in 1984, Apple transformed personal technology with the introduction of the Macintosh personal computer. Today, Apple is a world leader in innovation with phones, tablets, computers, watches and even TV. Now it seems Apple has dived headfirst into another technological innovation that may change computing as we know it. Introducing the Apple M1 chip. Recently, Apple announced the most powerful chip it has ever created, and the first chip designed specifically for its Mac product line. Boasting industry-leading performance, powerful features, and incredible efficiency, the M1 chip is optimized for Mac systems in which small size and power efficiency are critically important.

The First System on a Chip

If you haven’t heard of this before, you’re not alone; a system on a chip (SoC) is fairly new to the Mac. Traditionally, Macs and PCs have used numerous chips for the CPU, I/O, security, and more. The SoC combines all of these technologies into a single chip, resulting in greater performance and power efficiency. M1 is the first personal computer chip built using cutting-edge 5-nanometer process technology and is packed with an eyebrow-raising 16 billion transistors. M1 also features a unified memory architecture that combines high-bandwidth, low-latency memory into a single custom package. This allows all of the technologies in the SoC to access the same data without copying it between multiple pools of memory, further improving performance and efficiency.

M1 Offers the World’s Best CPU Performance

Apple’s M1 chip includes an 8-core CPU consisting of four high-performance cores and four high-efficiency cores. The high-performance cores are the world’s fastest CPU cores in low-power silicon, giving photographers the ability to edit high-resolution photos with rapid speed and developers the ability to build apps almost 3x faster than before. The four high-efficiency cores deliver exceptional performance at a tenth of the power: on their own, they can match the performance of the current-generation dual-core MacBook Air at much lower power, making them the most efficient way to run lightweight, everyday tasks like checking email and surfing the web while preserving battery life. When all eight cores work together, they deliver the world’s best CPU performance per watt.

Wondering how to sell your inventory of used CPUs and processors? Let us help.

The World’s Sharpest Unified Graphics

M1 incorporates Apple’s most advanced GPU, which benefits from years of analyzing Mac applications, from everyday apps to demanding pro workloads. With industry-leading performance and incredible efficiency, the GPU is truly in a league of its own. Featuring up to eight powerful cores, it can easily handle demanding tasks, from effortless playback of multiple 4K video streams to rendering intricate 3D scenes. With 2.6 teraflops of throughput, M1 has the world’s fastest integrated graphics in a personal computer.

Bringing the Apple Neural Engine to the Mac

Significantly increasing the speed of machine learning (ML) tasks, the M1 chip brings the Apple Neural Engine to the Mac. Featuring Apple’s most advanced 16-core architecture capable of 11 trillion operations per second, the Neural Engine in M1 enables up to 15x faster machine learning performance. With ML accelerators in the CPU and a powerful GPU, the M1 chip is intended to excel at machine learning. Common tasks like video analysis, voice recognition, and image processing will have a level of performance never seen before on the Mac.

Upgrading your inventory of Macs or laptops? We buy those too.

M1 is Loaded with Innovative Technologies

The M1 chip is packed with several powerful custom technologies:

  • Apple’s most recent image signal processor (ISP) for higher quality video with better noise reduction, greater dynamic range, and improved auto white balance.
  • The modern Secure Enclave for best-in-class security.
  • A high-performance storage controller with AES encryption hardware for quicker and more secure SSD performance.
  • Low-power, highly efficient media encode and decode engines for great performance and prolonged battery life.
  • An Apple-designed Thunderbolt controller with support for USB 4, transfer speeds up to 40Gbps, and compatibility with more peripherals than ever.

The Best Way to Prepare for a Data Center Take Out and Decommissioning

Whether your organization plans on relocating, upgrading, or migrating to the cloud, data center take outs and decommissioning are no easy feat. There are countless ways something could go wrong if you attempt such a daunting task on your own, so partnering with an IT equipment specialist that knows the ins and outs of data center infrastructure is the best way to go. Since 1965, our highly experienced team of equipment experts, project managers, IT asset professionals, and support staff has handled numerous successful data center projects in every major US market. From a single server rack to a warehouse-sized data center holding thousands of IT assets, we have the technical and logistical capabilities to handle your data center take out or decommission. Regardless of the requirements you’re facing, we can design a complete end-to-end solution to fit your specific needs.

 

Learn more about the data center services we offer

 

But that's enough about us. We wrote this article to help YOU. We put together a step-by-step guide on how to prepare your data center for a complete take out or for simply retiring the assets it holds. Like always, we are here to help every step of the way.

Make a Plan

Create a list of goals you wish to achieve with your take out or decommissioning project. Make an outline of expected outcomes or milestones with expected times of completion. These will keep you on task and ensure you stay on course. Appoint a project manager to oversee the project from start to finish. Most importantly, ensure backup systems are working correctly so there is no loss of data along the way.

 

Make a List

Be sure to make an itemized list of all hardware and software that will be involved with the decommissioning project or data center take out. Make sure nothing is disregarded, and check twice with a physical review. Once all of the equipment in your data center is itemized, build a complete inventory of assets, including hardware items such as servers, racks, networking gear, firewalls, storage, routers, switches, and even HVAC equipment. Collect all software licenses, including those associated with servers, networking equipment, and any virtualization platforms involved.
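To keep the itemized list from drifting out of sync with the physical review, it can help to capture each asset as a small structured record and group by category. A minimal Python sketch; the field names and sample assets are hypothetical placeholders you would adapt to your own tracking needs:

```python
from dataclasses import dataclass, field

@dataclass
class Asset:
    # Hypothetical fields for illustration; extend to match your audit requirements.
    asset_id: str
    category: str          # e.g. "server", "switch", "storage", "hvac"
    serial_number: str
    location: str
    licenses: list = field(default_factory=list)

def build_inventory(assets):
    """Group assets by category so nothing is overlooked in the count."""
    inventory = {}
    for a in assets:
        inventory.setdefault(a.category, []).append(a)
    return inventory

assets = [
    Asset("A-001", "server", "SN123", "Rack 1"),
    Asset("A-002", "switch", "SN456", "Rack 1"),
    Asset("A-003", "server", "SN789", "Rack 2"),
]
inventory = build_inventory(assets)
print({cat: len(items) for cat, items in inventory.items()})
# {'server': 2, 'switch': 1}
```

Comparing these per-category counts against the physical walkthrough is a quick way to catch anything disregarded.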

 

Partner with an ITAD Vendor

Partnering with an experienced IT Asset Disposition (ITAD) vendor can save you a tremendous amount of time and stress. An ITAD vendor can help with the implementation plan listing roles, responsibilities, and activities to be performed within the project. Along with the previous steps mentioned above, they can assist in preparing tracking numbers for each asset earmarked for decommissioning, and cancel maintenance contracts for equipment needing to be retired. 

Learn more about our ITAD process

 

Get the Required Tools

Before you purchase or rent any tools or heavy machinery, it is best to make a list of the tools, materials, and labor hours you will need to complete this massive undertaking. Some examples of tools and materials that might be necessary include forklifts, hoists, device shredders, degaussers, pallets, packing foam, hand tools, labels, boxes, and crates. Calculate the number of man hours needed to get the job done. Try to be as specific as possible about what the job requires at each stage. If outside resources are needed, make sure to perform the necessary background and security checks ahead of time. After all, it is your data at stake here.
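The labor-hour calculation described above can be sketched in a few lines. All figures below are illustrative placeholders, not real project estimates; substitute your own stage breakdown and rack count:

```python
# Illustrative-only figures for a hypothetical 40-rack take out.
racks = 40
hours_per_rack = {                 # hypothetical per-stage estimates
    "inventory_and_tagging": 1.5,
    "data_erasure": 3.0,
    "de-racking": 2.0,
    "packing_and_loading": 1.0,
}

total_per_rack = sum(hours_per_rack.values())   # 7.5 hours per rack
total_hours = racks * total_per_rack            # total man hours
crew_size = 5
days = total_hours / (crew_size * 8)            # assuming 8-hour shifts

print(total_hours, round(days, 1))
# 300.0 7.5
```

Being specific about each stage, as the text suggests, also makes it obvious where extra tooling (shredders, hoists, forklifts) would shorten the schedule.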

 

Always Think Data Security

When the time comes to start the data center decommissioning or take out project, review your equipment checklist and verify all of your data has been backed up before powering down and disconnecting any equipment. Be sure to tag and map cables for easier setup and transport, record serial numbers, and tag all hardware assets. For any equipment that will be transported off-site and no longer used, data erasure may be necessary. When transporting data offsite, make sure a logistics plan is in place. A certified and experienced ITAD partner will typically offer certificates of data destruction and chain of custody during the entire process. They may also advise you on erasing, degaussing, shredding, or preparing each piece of equipment for recycling as itemized.

Learn more about the importance of data security

 

Post Takeout and Decommission

Once the data center take out and decommission project is complete, the packing can start. Make sure you have a dedicated space for packing assets. If any equipment is allocated for reuse within the company, follow the appropriate handoff procedure. For assets intended for refurbishing or recycling, pack and label for the intended recipients. If not using an ITAD vendor, be sure to use IT asset management software to track all stages of the process.

FEATURED

Apple's Bug Bounty Program: Hackers Getting Paid

How does one of the largest and most innovative companies in history prevent cyber attacks and data hacks? They hire hackers to hack them. That's right: Apple pays up to $1 million to friendly hackers who can find and report vulnerabilities within its operating systems. Recently, Apple announced that it will open its Bug Bounty program to anyone who wants to report bugs, not just hackers who have previously signed up and been approved. 

 

Apple's head of security engineering Ivan Krstic says that this is a major win not only for iOS hackers and jailbreakers, but also for users, and ultimately even for Apple. The new bug bounties directly compete with the secondary market for iOS flaws, which has been booming in the last few years. 

 

In 2015, vulnerability broker Zerodium revealed that it would pay $1 million for a chain of bugs that allowed hackers to break into the iPhone remotely. Ever since, the cost of bug bounties has soared. Zerodium's highest payout is now $2 million, and Crowdfense offers up to $3 million.

So how do you become a bug bounty hunter for Apple? We'll break it down for you.

 

What is the Apple Security Bounty?

As part of Apple's devotion to information security, the company is willing to compensate researchers who discover and share critical issues and the methods they used to find them. Apple makes it a priority to fix these issues in order to best protect its customers against similar attacks. Apple offers public recognition for those who submit valid reports and will match donations of the bounty payment to qualifying charities.

See the Apple Security Bounty Terms and Conditions Here

Who is Eligible for a Bug Bounty?

 

In order to qualify for an Apple bug bounty, the vulnerability you discover must appear in the latest publicly available versions of iOS, iPadOS, macOS, tvOS, or watchOS with a standard configuration. The eligibility rules are intended to protect customers until an update is readily available. This also ensures that Apple can confirm reports, create necessary updates, and properly reward those doing original research. 

Apple bug bounty requirements:

  • Be the first party to report the issue to Apple Product Security.
  • Provide a clear report, which includes a working exploit. 
  • Not disclose the issue publicly before Apple releases the security advisory for the report. 

Issues that are unknown to Apple and are unique to designated developer betas and public betas can earn a 50% bonus payment. 

Qualifying issues include:

  • Security issues introduced in certain designated developer beta or public beta releases, as noted in their release notes. Not all developer or public betas are eligible for this additional bonus.
  • Regressions of previously resolved issues, including those with published advisories, that have been reintroduced in certain designated developer beta or public beta releases, as noted in their release notes.

How Does the Bounty Program Payout?

 

The amount paid for each bounty is decided by the level of access attained by the reported issue. For reference, a maximum payout amount is set for each category. The exact payment amounts are determined after Apple reviews the submission. 

Here is a complete list of example payouts for Apple’s Bounty Program

The purpose of the Apple Bug Bounty Program is to protect consumers by understanding both data exposures and the way they were exploited. In order to receive confirmation and payment from the program, a full detailed report must be submitted to Apple's Security Team.  

 

According to the tech giant, a complete report includes:

  • A detailed description of the issues being reported.
  • Any prerequisites and steps to get the system to an impacted state.
  • A reasonably reliable exploit for the issue being reported.
  • Enough information for Apple to be able to reasonably reproduce the issue. 

 

Keep in mind that Apple is particularly interested in issues that:

  • Affect multiple platforms.
  • Impact the latest publicly available hardware and software.
  • Are unique to newly added features or code in designated developer betas or public betas.
  • Impact sensitive components.

Learn more about reporting bugs to Apple here

FEATURED

LTO Consortium – Roadmap to the Future

LTO – From Past to Present 

Linear Tape-Open, more commonly referred to as LTO, is a magnetic tape data storage solution first created in the late 1990s as an open-standards substitute for the proprietary magnetic tape formats that were available at the time. It didn't take long for LTO tape to rule the super tape market and become the best-selling super tape format year after year. LTO is usually used with small and large computer systems, mainly for backup. The standard form factor of LTO technology goes by the name Ultrium. The original version of LTO Ultrium was announced at the turn of the century and is capable of storing up to 100 GB of data in a cartridge. Minuscule by today's standards, this was unheard of at the time. The most recent generation of LTO Ultrium is the eighth generation, released in 2017. LTO-8 has storage capacities of up to 12 TB (30 TB at a 2.5:1 compression rate).

The LTO Consortium is a group of companies that directs development and manages licensing and certification of the LTO media and mechanism manufacturers. The consortium consists of Hewlett Packard Enterprise, IBM, and Quantum. Although there are multiple vendors and tape manufacturers, they all must adhere to the standards defined by the LTO consortium.  

Need a way to sell older LTO tapes?

LTO Consortium – Roadmap to the Future

The LTO consortium disclosed a future strategy to further develop the tape technology out to a 12th generation of LTO. This happened almost immediately after the release of the recent LTO-8 specifications and the LTO-8 drives from IBM. Presumably sometime in the 2020s, when LTO-12 is readily available, a single tape cartridge should be capable of storing approximately half a petabyte of data.

According to the LTO roadmap, the blueprint calls for doubling the capacity of cartridges with every ensuing generation. This is the same model the group has followed since it distributed the first LTO-1 drives in 2000. However, the compression rate of 2.5:1 is not likely to change in the near future. In fact, the compression rate hasn’t increased since LTO-6 in 2013.
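The doubling model is easy to sanity-check. A short Python sketch projects native and compressed capacities from LTO-8 out to LTO-12, using the roadmap's doubling rule and the fixed 2.5:1 compression rate; these are roadmap projections only, and the consortium may adjust the figures for actual shipped generations:

```python
compression = 2.5          # compression rate, unchanged since LTO-6
capacities = {}
native = 12.0              # LTO-8 native capacity in TB

# Roadmap rule: capacity doubles with every ensuing generation.
for gen in range(8, 13):
    capacities[gen] = (native, native * compression)
    native *= 2

for gen, (nat, comp) in capacities.items():
    print(f"LTO-{gen}: {nat:g} TB native / {comp:g} TB compressed")
# LTO-12 comes out at 192 TB native / 480 TB compressed
```

The LTO-12 compressed figure of 480 TB is roughly the half a petabyte per cartridge mentioned above.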

Learn how you can pre-purchase the latest LTO-9 tapes 

The Principles of How LTO Tape Works

LTO tape is made up of servo bands which act like guard rails for the read/write head. The bands provide compatibility and adjustment between different tape drives. The read/write head positions between two servo bands that surround the data band. 

The read-write head writes multiple data tracks at once in a single, end-to-end pass called a wrap. At the end of the tape, the process continues as reverse pass and the head shifts to access the next wrap. This process is done from the edge to the center, known as linear serpentine recording.
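The linear serpentine pattern can be illustrated with a toy traversal: even-numbered wraps run forward, odd-numbered wraps run in reverse, so the head sweeps end to end without rewinding between passes. A simplified sketch; real drives also shift the head laterally from the edge toward the center between wraps:

```python
def serpentine_order(wraps, positions):
    """Return the sequence of (wrap, position) pairs a head visits:
    forward on even-numbered wraps, backward on odd-numbered ones."""
    path = []
    for w in range(wraps):
        cols = range(positions) if w % 2 == 0 else range(positions - 1, -1, -1)
        for p in cols:
            path.append((w, p))
    return path

# Two wraps over a tape with three positions per pass:
print(serpentine_order(2, 3))
# [(0, 0), (0, 1), (0, 2), (1, 2), (1, 1), (1, 0)]
```

Note how the second wrap starts where the first ended, which is exactly why the drive never needs a rewind between wraps.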

More recent LTO generations have a built-in auto speed mechanism, unlike older LTO tape generations that suffered stop-and-go operation of the drive as the flow of data changed. The built-in auto speed mechanism lowers the streaming speed if the data flow slows, allowing the drive to continue writing at a constant speed. To ensure that the data just written on the tape is identical to what it should be, a verify-after-write process is used: the tape passes a read head positioned after the write head.

But what about data security? To reach an exceptional level of data security, LTO has several mechanisms in place. 

Due to several data reliability features, including error-correcting code (ECC), LTO tape has an extremely low bit error rate, lower than that of hard disks. With both the LTO-7 and LTO-8 generations, data reliability has a bit error rate (BER) of 1 x 10^-19. This signifies that the drive and media will produce a single bit error in approximately 10 exabytes (EB) of stored data. In other words, more than 800,000 LTO-8 tapes can be written without error. In addition, LTO tape allows for an air gap between tapes and the network. Having this physical gap between storage and any malware or network attack provides an unparalleled level of security.

 

Learn more about air-gap data security here

FEATURED

The Role of Cryptocurrencies in the Age of Ransomware

Now more than ever, there is an obvious connection between the rising ransomware era and the cryptocurrency boom. Believe it or not, cryptocurrency and ransomware have an extensive history with one another. They are so closely linked that many have attributed the rise of cryptocurrency with a corresponding rise in ransomware attacks across the globe. There is no debating the fact that ransomware attacks are escalating at an alarming rate, but there is no solid evidence showing a direct correlation to cryptocurrency. Even though the majority of ransoms are paid in crypto, the transparency of the currency's blockchain makes it a terrible place to keep stolen money.

The link between cryptocurrency and ransomware attacks

There are two key ways that ransomware attacks rely on the cryptocurrency market. First, the majority of the ransoms paid during these attacks are usually in cryptocurrency. A perfect example is the largest ransomware attack in history, the WannaCry ransomware attacks. Attackers demanded that their victims pay nearly $300 in Bitcoin (BTC) to release their captive data.

A second way that cryptocurrencies and ransomware attacks are linked is through what is called “ransomware as a service”. Plenty of cyber criminals offer “ransomware as a service,” essentially letting anyone hire a hacker via online marketplaces. How do you think they want payment for their services? Cryptocurrency.

Read more about the WannaCry ransomware attacks here

Show Me the Money

From an outsider's perspective, it seems clear why hackers would require ransom payments in cryptocurrency: the blockchain is built on privacy and encryption, seemingly the best place to hide stolen money. Well, think again. There is actually a different reason why ransomware attacks make use of cryptocurrencies. The efficiency of cryptocurrency blockchain networks, rather than their concealment, is what really draws cybercriminals in.

The value of cryptocurrency during a cyber attack lies in the transparency of crypto exchanges. A ransomware attacker can keep an eye on the public blockchain to see if victims have paid their ransom, and can automate the procedures needed to give victims the stolen data back. 

On the other hand, the cryptocurrency market is possibly the worst place to keep the stolen funds. The transparent quality of the cryptocurrency blockchain means that the world can closely monitor the transactions of ransom money. This makes it tricky to convert the stolen funds into another currency without being tracked by law enforcement.

Read about the recent CSU college system ransomware attack here

Law and Order

Now, just because the paid ransom for stolen data can be tracked in the blockchain doesn't automatically mean that the hackers who committed the crime can be caught too. Due to the anonymity of cryptocurrency, it is nearly impossible for law enforcement agencies to find the true identity of cybercriminals. However, there are always exceptions to the rule. 

The blockchain allows transactions relating to a given bitcoin address to be traced all the way back to the original transaction. This gives law enforcement access to the financial records required to trace a ransom payment, in a way that would never be possible with cash transactions.
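Tracing funds on a public ledger is essentially a graph walk over linked transactions. A toy sketch, with entirely hypothetical transaction IDs and address names, that follows a payment forward from a ransom address until the trail ends:

```python
# Toy public ledger: txid -> (from_address, to_address).
# All names here are hypothetical illustrations, not real addresses.
ledger = {
    "tx1": ("victim_wallet", "ransom_addr"),
    "tx2": ("ransom_addr", "mixer_addr"),
    "tx3": ("mixer_addr", "exchange_addr"),
    "tx4": ("unrelated_a", "unrelated_b"),
}

def trace(start_addr, ledger):
    """Follow funds forward from an address across the public ledger,
    returning the chain of transaction IDs encountered."""
    path, current = [], start_addr
    while True:
        nxt = next((tx for tx, (src, _) in ledger.items() if src == current), None)
        if nxt is None:
            return path
        path.append(nxt)
        current = ledger[nxt][1]

print(trace("ransom_addr", ledger))
# ['tx2', 'tx3']
```

A real investigation works on a vastly larger graph with many outputs per transaction, but the principle is the same: every hop is publicly recorded, which is exactly what cash never offers.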

Due to several recent and prominent ransomware attacks, authorities have called for the cryptocurrency market to be watched more closely. In order to do so, supervision will need to be executed very carefully, so as not to detract from the appeal of the currency's anonymity. 

Protect Yourself Any Way You Can

The shortage of legislative control of the cryptocurrency market, mixed with the quick rise in ransomware attacks, indicates that individuals need to take it upon themselves to protect their data. Some organizations have taken extraordinary approaches such as hoarding Bitcoin in case they need to pay a ransom as part of a future attack. 

For the common man, protecting against ransomware attacks means covering your bases. You should double check that all of your cyber security software is up to date, subscribe to a secure cloud storage provider, and back up your data regularly. Companies of all sizes should implement the 3-2-1 data backup strategy in case of a ransomware attack. The 3-2-1 backup plan states that one should have at least three different copies of data, stored on at least two different types of media, with at least one copy offsite. It also helps to have a separate copy of your data stored via the air-gap method, preventing it from ever being stolen.
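The 3-2-1 rule is simple enough to encode as a checklist. A minimal sketch over a hypothetical catalogue of backup copies:

```python
# Hypothetical catalogue of backup copies for one dataset.
copies = [
    {"media": "disk",  "offsite": False},
    {"media": "tape",  "offsite": False},   # air-gapped when ejected
    {"media": "cloud", "offsite": True},
]

def satisfies_3_2_1(copies):
    """3-2-1 rule: at least 3 copies, on at least 2 media types,
    with at least 1 copy offsite."""
    return (len(copies) >= 3
            and len({c["media"] for c in copies}) >= 2
            and any(c["offsite"] for c in copies))

print(satisfies_3_2_1(copies))
# True
```

Dropping any one copy from the catalogue above breaks the rule, which is exactly the margin of safety the strategy is designed to provide.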

Learn More About Getting Your 3-2-1 Backup Plan in Place

FEATURED

TapeChat with Pat

At DTC, we value great relationships. Luckily for us, we have some of the best industry contacts out there when it comes to tape media storage and backup. Patrick Mayock, a Partner Development Manager at Hewlett Packard Enterprise (HPE), is one of those individuals. Pat has been with HPE for the last 7 years and, prior to that, spent 30 years in the data backup and storage industry. Pat is our go-to guy at HPE, a true source of support, and an overall great colleague. For our TapeChat series, Pat was our top choice. Pat's resume is an extensive one that would impress anyone who sees it. Pat started his data and media storage journey back in the early '90s in the Bay Area. Fast forward to today, and Pat can be found in the greater Denver area with the great minds over at HPE. Pat knows his stuff, so sit back and enjoy this little Q&A we set up for you. We hope you enjoy it, and without further ado, we welcome you to our series, TapeChat (with Pat)!

Pat, thank you for taking the time to join us digitally for this online Q&A. We would like to start off by stating how thrilled we are to have you with us. You’re an industry veteran and we’re honored to have you involved in our online content.

Thanks for the invite.  I enjoy working with your crew and am always impressed by your innovative strategies to reach out to new prospects and educate existing customers on the growing role of LTO tape from SMB to the Data Center. 

Let’s jump right into it! For the sake of starting things out on a fun note, what is the craziest story or experience you have had or know of involving the LTO / Tape industry? Maybe a fun fact that most are unaware of, or something you would typically tell friends and family… Anything that stands out…

I’ve worked with a few tape library companies over the years and before that I sold the original 9 track ½ inch tape drives.  Those were monsters, but you would laugh how little data they stored on a reel of tape. One of the most memorable projects I worked on was in the Bay Area, at Oracle headquarters.  They had the idea to migrate from reel to reel tape drives with a plan to replace them with compact, rack mounted, ‘robotic’ tape libraries.  At the end, they replaced those library type shelves, storing hundreds of reels of tape with 32 tape libraries in their computer cabinets.  Each tape library had room for 40 tape slots and four 5 ¼ full high tape drives.  The contrast was impressive.  To restore data, they went from IT staffers physically moving tape media, in ‘sneaker mode’ to having software locate where the data was stored, grab and load the tape automatically in the tape library and start reading data.   Ok, maybe too much of a tape story, but as a young sales rep at the time it was one that I’ll never forget. 

With someone like yourself who has been doing this for such a long time, what industry advancements and releases still get you excited to this day? What is Pat looking forward to right now in the LTO Tape world?

I'm lucky. We used to have five or more tape technologies all fighting for their place in the data protection equation, each from a different vendor. Now, Ultrium LTO tape has a majority of the market and is supported by a coalition of multiple technology vendors working together to advance the design. Some work on the physical tape media, some on the read/write heads, and some on the tape drive itself. The business has become more predictable and more reliable. About every two years the consortium releases the next level of LTO tape technology. We will see public announcements of LTO-9 technology by the end of 2020. And the thirst for higher storage capacity and higher performance in the same physical space is what keeps me more than optimistic about the future.

When our sales team is making calls and asks a business if they are still backing up to LTO tape, that question is often met with a dismissive, "that's outdated" response; in some cases we even receive laughter, along with something like "people still use tape?" Why do you think LTO as a backup option is getting this type of response? What is it specifically about the technology that makes businesses feel as if LTO tape is a way of the past?

As a Tape Guy, I hear that question a lot. The reality in the market is that some industries are generating so much data that they have to increase their dependence on tape based solutions as part of their storage hierarchy. It starts with just the cost comparison of data on a single disk drive versus that same amount of data on an LTO tape cartridge. LTO tape wins. But the real impact is so much bigger than just that. Think about the really large data center facilities. The bigger consideration is, for a given amount of data (a lot), what solution can fit the most data into a cabinet-sized footprint. Physical floor space in the data center is at a premium. Tape wins. Then consider the cost of having that data accessible. A rack of disk drives consumes tons more energy than a tape library. Tape wins again. Then consider the cooling costs that go along with all those spinning disk platters. Tape wins, creating a greener solution that is more cost effective. At HPE, and available from DTC, we have white papers and presentations on just this topic of cost savings. In summary, if a company is not looking at or using LTO tape, then their data retention, data protection, and data archiving needs are just not yet at the breaking point. 

There seems to be an emergence of the Disk / Hard Drive backup option being utilized by so many businesses. Do you feel like LTO Tape will ever be looked at with the same level of respect or appreciation by those same businesses?

If you are talking about solid state disk for high access and dedicated disk drive solutions for backup – sure, that works. But at some point you need multiple copies at multiple locations to protect your investment. The downside of most disk-only solutions is that all the data is accessible across the network. Nowadays, ransomware and cybersecurity are among the biggest threats to corporations, government agencies, and even mom and pop SMBs. The unique advantage of adding LTO tape based tape libraries is that the data is NOT easily tapped into, because the physical media is not in the tape drive. Again, HPE has very detailed white papers and presentations on this Air Gap principle, all available from DTC. 

LTO Tape vs Hard Drive seems to be the big two in terms of the data / backup realm, as an insider to this topic, where do you see this battle going in the far future?

It’s less of a battle and more of a plan to ‘divide the work load and let’s work together’.  In most environments, tape and disk work side by side with applications selecting where the data is kept. However, there are physical limitations on how much space is available on a spinning platter or set of platters, and this will dramatically slow down the growth of their capacity within a given form factor. With LTO tape technology, the physical areal footprint is so much bigger, because of the thousands of feet of tape within each tape cartridge. At LTO-8 we have 960 meters of tape to write on. Even at a half inch wide, that’s a lot of space for data. Both disk and tape technologies will improve how much data they can fit on their media, (areal density) but LTO tape just has the advantage of so much space to work with. LTO tape will continue to follow the future roadmap which is already spec’d out to LTO-12.  

With so many years in this industry, what has been the highlight of your career?

The technology has always impressed me: learning and talking about the details of a particular technical design advantage, then being able to work with a wide range of IT specialists and learning about their business and what they actually do with the data. But when I look back on the biggest highlights, I remember all the great people that I have worked with side by side to solve customers' storage and data protection problems. Sometimes we won, sometimes we didn't. I will never forget working to do our best for the deal. 

What tech advancements do you hope to see rolled out that would be a game changer for data storage as a whole?

The data storage evolution is driven by the creation of more data, every day.  When one technology fails to keep pace with the growth, another one steps up to the challenge.  Like I have said, LTO tape has a pretty solid path forward for easily 6 more years of breakthrough advancements. In 6 years, I’m sure there will be some new technology working to knock out LTO, some new technology that today is just an idea. 

We see more and more companies getting hit every day with ransomware and data theft due to hackers. What are your thoughts on this, and where do you see things going? Will we ever reach a point where this will start to level off or become less common?

Ransomware and cyber security are the hot topics keeping IT Directors and business owners up at night. It is a criminal activity that is highly lucrative. Criminals will continue to attempt to steal data, block access and hold companies for ransom wherever they can.  But they prefer easy targets. As I mentioned earlier, Tape Solutions offer one key advantage in this battle: if the data isn’t live on the network, the hacker has to work harder. This is a critical step to protect your data. 

For more information on Pat, data backup / storage, + more follow Pat on Twitter:

FEATURED

DTC – A True Partnership

For Over Half a Century, We've Been Committed to Serving IT Departments and Saving IT Budgets 

 

Our Story

In 1965, we opened our doors for business with the idea to transform the IT equipment industry through technology, transparency, standards, and processes. We planted our roots as a round reel tape company in Downey, CA. As a family owned and operated business over the past 50 years, we have sprouted into one of the most trustworthy, reliable, and authoritative organizations in the industry. 

From disk pack tape storage and round reel tape to hard drives, networked storage, tape libraries, and cloud backup systems; our business and partnerships continue to prosper and grow with the constantly innovative IT industry. DTC proudly works with all organizations, letting our reputation speak for itself.

DTC’s 3 Point Message is Simple:

 

  • Our goal is to reach 100% Recyclability of old storage media and IT assets.

 

Electronics recycling is our bread and butter. We've been saving both the environment and companies money by setting the standard for secure handling and repurposing of used and obsolete electronics. Recycling of electronics and IT equipment is an essential part of a company's waste management strategy. If you are looking for a safe and secure way of recycling electronics, then you should consider our proven services. We specialize in ethical disposal and reprocessing of used and obsolete electronics and computer equipment. We can help your organization accomplish its legal and environmental goals. Let us be the solution to your problem and help your organization stay socially responsible. 

 

Learn more about recycling your old IT assets

 

  • Our pledge since day one has been to keep your data safe.

 

Data security is a top concern for IT departments in any organization, and rightly so. Many of our partners demand that their data be handled appropriately and destroyed according to both government and industry standards. DTC provides honest and secure data destruction services, which include physical destruction with a mobile shredder and secure data erasure methods like degaussing. All of our destruction services are effective, auditable, and certified. Ship storage assets to our secured facility, or simply ask for the mobile data destroyer to be deployed on site. With over 50 years of service, we've never had a single data leak. Now that's experience you can trust!

Learn more about DTC data security

 

  • Our process will help you save time and money.

 

Our IT asset disposition (ITAD) process will help your organization recoup dollars from your surplus, used IT assets and free up storage space at your facility. Our equipment buyback program is dedicated to purchasing all types of surplus and used data storage and IT equipment. We use the highest standards to ensure you get the greatest return on your initial IT investment. With the current pace of hardware evolution, most companies are upgrading their systems every two years. This leads to a lot of surplus IT equipment. DTC has the experience and resources to get you the most for your old IT assets.

Get the most return on your IT investment 

The Value We Provide

DTC's diverse knowledge base and experience allow our partners to use our purchasing and sales personnel as a valued resource for questions, research, and answers. Our vast database and contact list of customers, resellers, recyclers, suppliers, and industry partners allows us to secure excellent pricing when sourcing your IT equipment. Don't believe us? Let us know what you need, and we will find it for you. 

How can we help you?

Here is a brief list of services we provide:

 

Ready to work with a trusted partner? Contact Us Today



FEATURED

The TikTok Controversy: How Much Does Big Tech Care About Your Data and its Privacy?

If you have a teenager in your house, you’ve probably encountered them making weird dance videos in front of their phone’s camera. Welcome to the TikTok movement that’s taking over our nation’s youth. TikTok is a popular social media video sharing app that continues to make headlines due to cybersecurity concerns. Recently, the U.S. military banned its use on government phones following a warning from the DoD about potential personal information risk. TikTok has now verified that it patched multiple vulnerabilities that exposed user data. In order to better understand TikTok’s true impact on data and data privacy, we’ve compiled some of the details regarding the information TikTok gathers, sends, and stores.

What is TikTok?

TikTok is a video sharing application that allows users to create short, fifteen-second videos on their phones and post the content to a public platform. Videos can be enriched with music and visual elements, such as filters and stickers. The app's young adolescent demographic, along with the content that is created and shared on the platform, has put its privacy features in the limelight as of late. Even more so, questions about where TikTok data is stored and who can access it have raised red flags.

You can review TikTok’s privacy statement for yourself here.

TikTok Security Concerns

Even though TikTok allows users to control who can see their content, the app does ask for a number of consents on your device. Most noteworthy, it accesses your location and device information. While there's no evidence of malicious activity or that TikTok is violating its privacy policy, it is still advisable to exercise caution with the content that's both created and posted.

The biggest concern surrounding the TikTok application is where user information is stored and who has access to it. According to the TikTok website, “We store all US user data in the United States, with backup redundancy in Singapore. Our data centers are located entirely outside of China, and none of our data is subject to Chinese law.” Its privacy policy adds, “The personal data that we collect from you will be transferred to, and stored at, a destination outside of the European Economic Area (“EEA”).” There is no other specific information regarding where user data is stored.

Recently, TikTok published a Transparency Report which lists “legal requests for user information”, “government requests for content removal”, and “copyrighted content take-down notices”. The “Legal Requests for User Information” section shows that India, the United States, and Japan are the top three countries where user information was requested. The United States was the number one country in both fulfilled requests (86%) and the number of accounts specified in the requests (255). Oddly enough, China is not listed as having received any requests for user information.

What Kind of Data is TikTok Tracking?

Below are some of the consents TikTok requests on Android and iOS devices after the app is installed. While some of the permissions are to be expected, all of them are consistent with TikTok’s written privacy policy. Still, seeing everything TikTok gathers from its users can be alarming. In short, the app allows TikTok to:

  • Access the camera (and take pictures/video), the microphone (and record sound), the device’s WIFI connection, and the full list of contacts on your device.
  • Determine if the internet is available and access it if it is.
  • Keep the device turned on and automatically start itself.
  • Secure detailed information on the user’s location using GPS.
  • Read and write to the device’s storage, install/remove shortcuts, and access the flashlight (turn it off and on).

You read that right: TikTok has full access to your audio, video, and the list of contacts on your phone. The geolocation tracking via GPS is somewhat surprising though, especially since TikTok videos don’t display location information. So why collect that information? If you operate an Android device, TikTok has the capability of accessing other apps running at the same time, which can give it access to data in another app such as a banking or password storage app.

Why is TikTok Banned by the US Military?

In December 2019, the US military started instructing soldiers to stop using TikTok on all government-owned phones. This policy reversal came shortly after the release of a Dec. 16 Defense Department Cyber Awareness Message classifying TikTok as having potential security risks associated with its use. Since the US military cannot prevent government personnel from accessing TikTok on their personal phones, leaders recommended that service members use caution if unfamiliar text messages are received.

In fact, this was not the first time that the Defense Department had to encourage service members to remove a popular app from their phones. In 2016, it banned the augmented-reality game Pokémon Go from US military-owned smartphones. That case was a bit different, however, as military officials alluded to concerns over productivity and the potential distractions the game could cause. The concerns over TikTok are focused on cybersecurity and spying by the Chinese government.

In the past, the DoD has put out more general social media guidelines, advising personnel to proceed with caution when using any social platform. And all DoD personnel are required to take annual cyber awareness training that covers the threats that social media can pose.

FEATURED

DTC Printer Services to the Rescue

At DTC Computer Supplies, we recognize the importance of quick and thorough office equipment repairs for your business. Downtime costs you productivity and money, so we strive to fix it right in a timely manner. We have a team of expert field engineers waiting to help you! Our engineers can usually diagnose your issue over the phone and resolve your issues on the first trip to your site. As a family-owned business since 1965, we know what it means to have the job done right the first time. We pride ourselves on quality work and customer service, meaning you can dial directly to a real person, instead of trying to find your way through a phone maze. Let us show you why we’ve been a local favorite and a trusted industry leader for over 50 years. 

Introducing DTC’s TotalCare© Laser Printer Maintenance Program…

The printer is that one piece of office equipment that you really don’t appreciate until it doesn’t work. But with so many brands, models, and parts on the market, is it really worth trying to spearhead your printer repair yourself? Most likely you will spend more time troubleshooting the error and more money buying the wrong parts than you need to. It’s wise to leave this job to the experts. At DTC, we recognize that not every business is the same. That’s why we’ve created three different tiers for our TotalCare© Maintenance Program, giving you the freedom to choose which program is right for your business. No matter what your printer issue is, we’ve got you covered. We use only high-quality parts that meet or exceed OEM specifications, ensuring your printer is back up and running FAST!

3 Levels to Choose From – Find the one that fits your business!

 

Level 1 (TotalCare© Silver Package)

No Trip charge (20 mile radius)
Discounted labor
15% discount on parts
Free yearly cleanings
100% guarantee
1-4 printers
8 hr. response time

 

Level 2 (TotalCare© Gold Package)

No trip Charge
Free labor
Free parts included: Pickup rollers, transfer rollers, feed rollers
25% discount on all other parts
Free yearly cleanings
5-15 printers
6 hr. response time
100% guarantee

 

Level 3 (TotalCare© Platinum Package)

No trip Charge
Free Labor and Maintenance
All consumable parts included*
Free yearly cleanings
15+ printers
4 hr. response time
100% guarantee

Local Printer Service Rates

Labor Rates

In-Shop / Remote Rates:

$45 if we can fix it in 30 minutes or less, otherwise $75 per hour

On Site Rates:

$75 per hour (minimum of one hour).

Travel Fees:

0-10 Miles = Free of charge

11-25 Miles = $25 Flat fee

26-50 Miles = $50 Flat fee

50+ Miles = $50 plus $1.50 per mile over 50 miles
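For anyone estimating a service call, the travel-fee schedule above boils down to a simple tiered calculation. Here's a small sketch of it in Python (the tier boundaries and rates come from the list above; `travel_fee` is just an illustrative name):

```python
def travel_fee(miles: float) -> float:
    """Tiered travel fee: free up to 10 miles, $25 flat for 11-25,
    $50 flat for 26-50, then $1.50 per mile beyond 50."""
    if miles <= 10:
        return 0.0
    if miles <= 25:
        return 25.0
    if miles <= 50:
        return 50.0
    return 50.0 + 1.50 * (miles - 50)

print(travel_fee(8))    # 0.0  -- within the free radius
print(travel_fee(18))   # 25.0 -- flat fee tier
print(travel_fee(60))   # 65.0 -- $50 plus 10 extra miles at $1.50
```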

DTC Premium Toner and Ink

Arguably, the most common printer problem you’ll encounter is running out of toner. Unfortunately, you usually don’t realize this until it’s too late. The fix is easy: simply replace the toner cartridge in the printer. Choosing the toner itself is a bit more complicated.

Why Choose DTC Premium Toner Cartridges over the Competition?

  • DTC Premium Toner Cartridges can save you up to 25-50% over OEM Toner.
  • Our Compatible Toner Cartridges are NOT Re-Manufactured. We use High-Quality OEM-Grade Components.
  • Our Toner Cartridges are produced in factories, and with quality-control processes, that are 100% ISO 9001, ISO 14001, and STMC compliant.
  • Our Toner Cartridges come with a 1-Year Unconditional Guarantee that they will meet or exceed OEM specifications. All DTC Premium Toner Cartridges go through a rigorous and extensive inspection.
  • We pride ourselves on a defect rate of less than 1%.

DTC also provides FREE parts and labor with the purchase of toner. With your first toner purchase, we will deliver to your office, provide a free cleaning of the printer, and install the cartridge. No contracts, and you can cancel at any time. It doesn’t get much better than that!

With DTC replacement toner and ink cartridges, you have options:

DTC Laser Toner

DTC Laser Toner is ideal for high-volume printing needs, best suited for small to medium businesses and schools that print a lot of text-heavy documents and color prints.

  • Save up to 60% over OEM toner.
  • 1-Year Unconditional Guarantee
  • Typically have the highest page yields of up to 20,000 pages.

OEM Laser Toner

DTC has superb-quality OEM toner from all major brands, including HP, Xerox, Brother, and Lexmark.

  • Designed to meet the exact specifications your printer requires.
  • Guarantees complete color accuracy for color-matching.
  • Full manufacturer warranty.

Ink Cartridges

DTC ink cartridges are ideal for printing small volumes on a regular basis and for quality photos.

  • Can replace cartridges individually in cyan, magenta, yellow, and black.
  • Standard ink cartridge can print 200-500 pages.
  • Can be easier and cleaner to replace than laser cartridges.

Call DTC Computer Supplies today @ 1-800-700-7683 or email us @ contact@dtc1.com.

FEATURED

3-2-1 Backup Rule

What is the 3-2-1 Backup Rule?

 

The 3-2-1 backup rule is a concept made famous by photographer Peter Krogh. He basically said there are two types of people: those who have already had a storage failure and those who will have one in the future. It’s inevitable. The 3-2-1 backup rule helps answer two important questions: how many backup files should I have, and where should I store them?

The 3-2-1 backup rule goes as follows:

  • Have at least three copies of your data.
  • Store the copies on two different media.
  • Keep one backup copy offsite.

Need help with building your data backup strategy?

  1. Create at least THREE different copies of your data

Yes, I said three copies. That means that in addition to your primary data, you should also have at least two more backups you can rely on if needed. But why isn’t one backup sufficient, you ask? Suppose you keep your original data on storage device A and its backup on storage device B. Both devices have the same characteristics, and they have no common failure causes. If device A has a 1/100 probability of failure (and the same is true for device B), then the probability of both devices failing at the same time is 1/10,000.

So with THREE copies of data, if you have your primary data (device A) and two backups of it (devices B and C), and if all devices have the same characteristics and no common failure causes, then the probability of all three devices failing at the same time is 1/1,000,000. That’s much better than having only one copy and a 1/100 chance of losing it all, wouldn’t you say? Keeping more than two copies of data also helps you avoid storing the primary copy and its backup in the same physical location, which matters in the event of a natural disaster.
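Those odds are easy to verify. The sketch below reproduces the arithmetic exactly, assuming each device fails independently with a 1/100 probability (a simplification — real-world failures are rarely fully independent):

```python
from fractions import Fraction

p_fail = Fraction(1, 100)  # assumed per-device failure probability

def p_total_loss(copies: int) -> Fraction:
    """Probability that every copy fails at once, assuming independence:
    the product of the individual failure probabilities."""
    return p_fail ** copies

print(p_total_loss(1))  # 1/100      -- a single copy
print(p_total_loss(2))  # 1/10000    -- primary plus one backup
print(p_total_loss(3))  # 1/1000000  -- the full 3-copy rule
```

Using exact fractions rather than floats keeps the printed odds in the same 1-in-N form used in the text.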

  2. Store your data on at least TWO different types of media

Now in the last scenario above we assumed that there were no common failure causes for all of the devices that contain your precious data. Clearly, this requirement is much harder to fulfill if your primary data and its backup are located in the same place. Disks from the same RAID aren’t typically independent. Even more so, it is not uncommon to experience failure of one or more disks from the same storage compartment around the same time.

This is where the 2 in the 3-2-1 rule comes in. It is recommended that you keep copies of your data on at least TWO different storage types — for example, internal hard disk drives AND removable storage media such as tapes, external hard drives, USB drives, or SD cards. It is even possible to keep data on two internal hard disk drives in different storage locations.

 

Learn more about purchasing tape media to expand your data storage strategy 

  3. Store at least ONE of these copies offsite

Believe it or not, physical separation between data copies is crucial. It’s a bad idea to keep your external storage device in the same room as your primary storage device. Just ask the numerous companies located in the path of a tornado or in a flood zone. Or what would you do if your business caught fire? If you work for a smaller company with only one location, storing your backups in the cloud is a smart alternative. Tapes stored at an offsite location are also popular among companies of all sizes.

 

Every system administrator should have a backup. This principle works for any virtual environment; regardless of the system you are running, backup is king!

FEATURED

Apple iPad Mini GIVEAWAY !!!

It’s Giveaway time! DTC Computer Supplies is giving away a brand new Apple iPad Mini to one of our lucky followers. It’s easy to enter for your chance to win. All you have to do to qualify is:

  1. Like / Follow us on Instagram, Facebook, Twitter or LinkedIn.
  2. Re-Post the ad onto your social media platform
  3. Use the Hashtag #DTCiPad in the post caption.

We will be choosing the lucky winner Friday, August 21st @ 12PM PST. Good luck, spread the word,  and thanks for all your support!

Like / Follow DTC Computer Supplies here:

INSTAGRAM: https://www.instagram.com/dtccomputersupplies/

TWITTER: https://twitter.com/DTCcompsupplies

FACEBOOK: https://www.facebook.com/DTCcomputersupplies/

LINKEDIN: https://www.linkedin.com/company/dtccomputersupplies/

 

Contest Rules:

  • Ad must remain posted onto your social media for the duration of the contest
  • Remain following DTC on social media for the duration of the contest

FEATURED

3 Things Star Wars Taught Us About Data Storage

A long time ago in a galaxy far, far away, a farm boy on a desert planet joined an uprising while saving a princess from a dark lord. Just like that, one of the most popular cinematic stories of all time was created. What began with one film in 1977 rapidly developed into one of the most successful media empires in existence. Now, more than four decades after it was introduced to audiences worldwide, Star Wars remains a global pop culture sensation. 

While doing my initial research for this article, I came across quite an amazing factoid: George Lucas is truly a data backup and cloud storage buff! Knowing this bit of information, I began to see it throughout the Star Wars narrative. In the wake of this newfound knowledge and inspiration, I’d urge you to weigh the following to make sure your organization doesn’t endure the same struggles that Darth Vader and the Skywalkers have over the course of their adventures.

Have a Data Security Plan in Place

It is pretty common knowledge that a decent data backup strategy starts with a strong approach to data security. With a data security plan in place, an organization can considerably reduce the chances of having to rely solely on backups. Sadly, even the most minimal types of security appear to be neglected throughout the Star Wars saga.

The first eight installments of the saga are rampant with data security concerns, but possibly the most noticeable occurs when Obi-Wan looks into the origins of a Kamino saberdart. While looking for the planet Kamino in the Jedi Archives, he discovers a blank space. Yoda concludes that the planet has likely been deleted from the archives.

Some theories state that Jedi training acted as a specific type of password protection, since the Force is required to manipulate the Jedi Archives. This wouldn’t be very wise considering there were numerous trained Jedi in the galaxy, and their sworn enemies were Force users. A present-day equivalent would be Google and Amazon offices sharing the same keycards. Not exactly infosec friendly, to say the least! The Jedi had weak passwords and absolutely no permissions management.

In an effort to prevent cyber criminals from potentially accessing your company’s vital data, it is good practice to perform regular reviews of your data security strategies. Double check that there are no glaring holes in your data security strategy, habitually change passwords, use two factor authentication, and ALWAYS use encryption. 

Have a Backup Plan in Place

Having a data backup plan in place is a key complement to any security strategy. As the Jedi learned, even when your security is set up perfectly, disaster can strike. Inadequate security management on both sides led to the destruction of six planets, the mutilation of two others, and the obliteration of a handful of superweapons.

The best approach for such a plan is referred to as a 3-2-1 backup strategy. This means that for every piece of data, you have the data itself, a backup copy on site (an external hard drive of some sort), and a final copy in the cloud. Now, I’ve come across several questions regarding the specific data storage medium used in the saga. Why does the Star Wars universe use a technology as basic as data tapes? Simply put, backups are better on tape. The fewer moving parts an object has, the more robust it is. Additionally, the long-term lifespan of electronic memory is a dilemma compared to that of tape. When you make a backup of something, you want it to survive a long time.

We first see the blueprints for the superweapon when Count Dooku decides that they would be safe with Darth Sidious on Scarif. By the time Jyn Erso hears of the Stardust version of the plans for the Death Star, it appears that Scarif is the only place in the galaxy where you could find a copy. In a way, the plans on Scarif served as the Empire’s cloud storage. All the Death Star needed was an external hard drive following it through space with an additional copy of the plans!

If you only have one backup, it’s better than nothing. Not much better, but it is. Make certain you’re using a 3-2-1 approach for a better backup strategy. Also consider data backups allocated to different geographic regions in case of a natural disaster. Tape is great for this!

Have a Version Control Plan in Place

What good is it to have backups if you don’t have the right information backed up? If we came away with one thing from the plans used to defeat the first Death Star, it’s that the Empire didn’t manage their version control. The backups they had were not up to date: the plans placed the focus lens for the superlaser on the equator, when the Death Star’s superlaser was actually in the northern hemisphere.

To be reliable, a backup solution needs to run on a consistent basis — sometimes daily or even more frequently, depending on the situation and the data on hand. Whatever backup strategy the Death Star was using, it had clearly gone askew.

Version History is also a critical piece of the backup puzzle. Version History lets users keep multiple versions of a file over extended periods of time, sometimes forever. Think if the Empire had set up Version History: they could have reverted to the pre-final plans used to destroy the Death Star.

Whether you manage a small business or a global enterprise, your data (assuming it’s properly secured and backed up) can mean the difference between domination and having your battle station blown into a million pieces across the dark void of space. Data security and backup don’t need to be a battle. Create a plan that works for you, ensure your data is secured, and verify every so often that it’s up to date with the most recent versions. May The Data Be With You.

FEATURED

Industries We Serve

DTC serves a wide range of industries, from healthcare, finance, and education to small business. In today’s rapidly evolving, data-focused world, every trade needs to keep its data safe and its systems running. Our pledge since day one has been to keep your data safe, no matter what industry you’re in.

 

MAKING AN IMPACT IN EVERY INDUSTRY

At DTC, we work behind the scenes to solve your IT equipment challenges, improve data security, and reduce the environmental impact of recycled assets. Whether your organization requires extensive data center services, IT asset disposition, e-waste recycling, on-site data destruction, or just general maintenance and support, we’ve got you covered. For over 50 years, we’ve made a name for ourselves as one of the most trusted brands in our industry, and we want to help you do the same. See below how we can help your industry.

HEALTHCARE

With extensive databases of medical records containing vital information, data maintenance, storage, and security are an essential part of healthcare. Yet however heavily healthcare institutions depend on computers and other electronic devices, not many organizations have a dedicated IT department with the ability to handle the vast equipment requirements. Our IT equipment specialists can help with your organization’s specific needs. We can help save your healthcare institution’s budget, while you help save lives.

A plan for guarding against ransomware in the healthcare industry

Finance

Another data breach is the last thing any financial institution needs. Most financial organizations have highly developed IT professionals running their data infrastructures, but upgrades and disposal can become overwhelming. Industry best practices state that companies need to keep their data for several decades and should migrate the data to the latest software version of their backup solution. That’s where we come into play. Since 1965, we have been helping businesses, both large and small, keep their data secure while increasing ROI on used IT equipment.

Get Help With Securing Your Financial Data

 

Government

Today more than ever, government agencies depend on computers and electronics to operate effectively within their respective borders. Most governments, whether city, county, state, or national, have an extremely long holding period for sensitive, proprietary, or administration-critical data. Even though data tape is the most reliable medium for long-term data storage, tapes still need to be updated over time. More importantly, we understand the importance of data security when it comes to sensitive information on your used equipment.

 

Education

The recent COVID-19 pandemic has given us all a firsthand look at how classrooms across the country are integrating 1:1 student-to-computer learning. Now more than ever, IT equipment is heavily integrated into school settings. Computer learning means a vast increase in information and data creation, requiring substantial data backup and security measures. Yet however heavily learning institutions depend on computers and other electronic devices, not many school districts have a dedicated IT department with the ability to handle the vast data backup and equipment requirements.

Learn More About Saving IT Budgets in the Education Field

Energy and Utilities

Companies in the energy and utility industries are continuously being asked to find ways to reduce costs and improve efficiency. Whether in power generation or alternative energy services, most companies have a very busy IT department. When the computers, servers, and data storage centers need to be upgraded and replaced, it can be exhausting. The right IT asset disposition (ITAD) partner can help you get the most value from old IT equipment, resulting in maximum return on your initial investment.

Learn More About ITAD and How it can Help You

Small and Mid-size Business

Entrepreneurship, innovation, and job creation are not only the foundation of a healthy economy but the backbone of America. Entrepreneurs embody the strength and character that help make our country great and we’d like to keep it that way. As a family-owned business since 1965, we understand the struggles and sacrifices of being a business owner. Small businesses and owners may not always have the budget for the latest upgrades in IT equipment or even know what to do with their older equipment. We can help with upgrades and disposal of your data storage tapes and other IT assets when needed. 

OUR GOAL IS TO BE YOUR MOST VALUED PARTNER

DTC’s IT equipment specialists and procurement associates bring more than 130 years of combined experience, making us one of the industry’s best-trained teams. We opened our doors in 1965 with the idea of transforming the IT lifecycle through technology, transparency, standards, and processes. Our business continues to flourish and grow with the ever-changing IT industry, letting our reputation speak for itself.

Interested in Learning More About Who We Are?

Send a message, we’d love to hear from you.

FEATURED

What is I.T.A.D. ?

What is ITAD?

You may have heard the fancy term ITAD or SITAD being thrown around the IT world as of late. What is ITAD exactly? We’ll keep it simple: ITAD is an acronym for Information Technology Asset Disposition. Some also refer to it as SITAD (Secure Information Technology Asset Disposition). In a nutshell, IT asset disposition is the process of disposing of obsolete, retired, or unwanted equipment in an environmentally friendly and responsible manner. ITAD service partners specialize in the processes related to disposing of and remarketing IT assets. Partnering with an experienced ITAD company can also help organizations reduce expenses and increase the value recovered from their used IT assets.

How can ITAD Benefit You?

IT asset disposition service providers can help you get rid of your surplus IT equipment or decommission your current data storage infrastructure. Not only can they dispose of it properly, they can help you get paid for it too! Once they purchase the equipment, they use their own end-user network to recoup as much value from it as possible. There is still significant life left in some equipment, and it could benefit a growing organization that can’t afford to procure new equipment.

Learn More about Data Center Services Here

How does the ITAD Market Work?

The IT asset disposition market is a secondary IT market used by ITAD companies to remarket the used and retired assets they purchase. Many ITAD companies coordinate with other ITAD companies to sell equipment off to the highest bidder. Some rely on a large open network of buyers such as BrokerBin; others deal with ITAD companies directly, although most sell directly to end-users.

How Do I Choose the ITAD Partner Best for Me?

With hundreds if not thousands of ITAD companies existing today, it can appear overwhelming to know which one is best for what your company requires.

Here is a list of things you may want to consider when looking for an ITAD partner:

  1. Does your business have large amounts of inventory and need decommissioning services? You may want to choose a partner who provides:

  2. Does your business have a small amount of used inventory that you can ship yourself? You may want to consider a partner who offers:

  3. Does your business have extremely sensitive data on the equipment that needs to be decommissioned? You may want to think about a partner who has:

You should always be willing to get multiple quotes and get a feel for who works best for you and your organization. No two ITAD providers are the same. It is essentially a partnership and should be treated as such.

If you’re in need of ITAD services for your used IT equipment, contact us today.

FEATURED

LTO-9 Tape Technology (Pre-Purchase Program)

Our LTO-9 Pre-Purchase Program allows anyone to pre-order LTO-9 tape technology before it is available. This is the ninth generation of tape technology, delivering on the promise made by the LTO Consortium to develop LTO tape technology through at least 12 generations. In an endeavor to deliver our customers the latest technology on the market, we are offering pre-orders of LTO-9 tape technology. This gives our customers the best opportunity to receive the latest generation of LTO tape as soon as it’s available. LTO-9 is expected to be available in Fall 2020.

 How to Buy: CLICK HERE | or call us today @ 1-800-700-7683.

How to Sell: For those looking to sell your old data tapes prior to upgrading to LTO-9, CLICK HERE to submit your inventory and we will contact you back within 24 Hours.


LTO TECHNOLOGY FOR LONG-TERM DATA PROTECTION

LTO tape technology provides organizations with reliable, long-term data protection and preservation. With LTO tape drives, organizations can meet security and compliance requirements while saving on storage footprint, power, and cooling costs, which can make a significant difference in operating costs for larger library environments.

LTO-9 FEATURED HIGHLIGHTS

  • Lowest cost per GB.

  • Tape offers lower power and cooling costs, plus a lower footprint leads to improved TCO.

  • Linear Tape File System (LTFS) support.

  • AES 256-bit Encryption – Military-grade encryption comes standard.

  • WORM technology – Makes data non-rewriteable and non-erasable, which acts as an immutable vault within your tape library to secure and protect an offline copy from ransomware.

LTO-9 vs. LTO-8

LTO-9 (Linear Tape-Open 9) is the most recently released tape format from the Linear Tape-Open Consortium, following the LTO-8 format which launched in 2017. LTO-9 is expected to double the capacity of LTO-8 to 60 TB compressed. LTO-8 provides 30 TB of compressed storage capacity and 12 TB of uncompressed capacity, doubling what LTO-7 offered.

Although the LTO Consortium has not yet announced the data transfer rate for LTO-9, LTO-8 features an uncompressed data transfer rate of up to 360 MBps and a compressed data transfer rate of up to 750 MBps.
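To put those figures in perspective, here's the back-of-the-envelope math for writing a full LTO-8 cartridge at its native rate (this sketch assumes decimal units and a sustained peak speed, so real-world jobs will take longer):

```python
capacity_tb = 12   # LTO-8 native (uncompressed) capacity, TB
rate_mb_s = 360    # LTO-8 native transfer rate, MB/s

# Decimal TB -> MB, then divide by the sustained write rate
seconds = capacity_tb * 1_000_000 / rate_mb_s
hours = seconds / 3600
print(f"About {hours:.1f} hours to fill one cartridge")
```

At those rates, a full native cartridge takes a little over nine hours to write, which is why large restores are usually planned around tape's sequential throughput rather than random access.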

LTO-9 has a similar structure to LTO-8 in that tape drives are backward-compatible by one generation; essentially, LTO-8 drives can read and write LTO-7 tapes. LTO drives had typically been able to read back two generations and write back one, but with LTO-8 the backward reading compatibility was limited to one generation.

LTO-9 also features the same WORM, LTFS, and 256-bit encryption technology as the prior generation, LTO-8.

Uses for LTO-9

LTO features high capacity, durability, and portability for a comparatively low cost. Archived data is not normally needed on an immediate basis, making tape a solid backup option. More commonly, backup data is used for restores in the event of an incident or data loss.

LTO-9 tapes housed at an off-site location are a fantastic option for disaster recovery. If an organization’s main data hub has an incident, it can use the durable LTO-9 tapes to recover its data. According to the LTO Consortium, once data becomes less frequently retrieved, it should be migrated to tape.

Tape is particularly useful in industries such as entertainment and healthcare that generate large volumes of data every day and require a long-term data storage option that’s less expensive than disk. As ransomware attacks stay in the headlines, tape provides an offline backup storage option immune to cyberattack. Data stored on an LTO-9 tape cartridge does not have to be connected to the network, creating what is called an air gap and providing a safety net against cyberattacks.

Pros and Cons of LTO-9 Tape

Tape capacity continues to expand. When LTO-9 launches, the compressed capacity of LTO tape products will have grown to almost 60 TB in roughly 10 years. As data levels continue to grow rapidly for many organizations, capacity is one of the most important aspects of data storage media. The cost of tape is also low compared to storing 60 TB on other storage media such as disk or flash, particularly when taking energy and equipment into consideration, as a constant energy source is not required to keep data stored on tape.

Other advantages of LTO-9 tape include:

  • A reliable generational roadmap that allows customers to count on a new product every few years, and a capacity that is not far off from the original estimate.

  • 256-bit encryption that guarantees security during storage and shipment. Tape’s offline nature also serves as protection from ransomware and cyberattacks, creating an air gap.

  • A reputation of being extremely reliable, with a lifespan of roughly 30 years. The tape format is also portable, making it remarkably easy to transport.

LTO’s open format also allows customers to access multiple, compatible products. The open format offers intellectual property licenses to prospective manufacturers, leading to innovation and improvements. However, LTO products are not compatible with non-LTO products.

Depending on the amount of data you need to store, cloud storage can be less expensive than tape. In some instances, cloud backup providers offer a free tier up to a specified volume of data. Cloud also offers random access, unlike tape. But restoring data files can be slow depending on data volume and bandwidth.

FEATURED

14 questions to ask before upgrading your servers

Servers are almost always used with specific objectives in mind. Regardless of whether the server is installed in a small business or a large enterprise, the server's role can change over time and sometimes grows to fulfill other services and responsibilities. Therefore, it's important to review a server's resource load to help ensure the organization improves performance and avoids downtime.

What do you do when your servers are obsolete and ready to be retired? Unfortunately, server upgrades aren't as easy as just dropping in more RAM; they require extensive planning.

The server is essentially the backbone of a business's IT functionality. Acquiring and installing a new server is a large undertaking for any business, and choosing the correct server is an important investment in an organization's future.

So, what should you consider when it’s time to upgrade? To make matters a little easier, we’ve put together a list of 14 things to consider when upgrading your servers to ensure your organization’s systems perform at the highest levels.

Does it fit your needs?

First, let’s make sure that the new server is able to meet your organization’s IT needs. Determine the necessary requirements, compile this information, and work from there.

Is integration possible?

Check if you are able to integrate sections of your existing server into the new server. This could potentially save money on new technology and provide a level of consistency in terms of staff knowledge on the existing technology. Upgrading doesn’t mean that you need to throw your old equipment in the trash.

What are the costs?

Once you understand the performance requirements, the next step is to gauge which servers meet this most closely. Technology can be very expensive, so you shouldn’t pay for any technology that won’t be of use to your organization’s output.

What maintenance is involved?

Even the most current technology needs to be maintained and any length of downtime could be disastrous for an organization. Ensure that some form of maintenance cover is put in place. Usually, there is a warranty included, but it comes with an expiration date. Make sure you ask about extended warranty options if they’re available.

What about future upgrades?

Considering the future is critical when it comes to working with new technology. The fast pace at which technology develops means that you may need to consider growing your server a lot sooner than you expected. 

Do you have a data backup?

Never make any changes or upgrades to a server, no matter how minor, without having a data backup. When a server is powered down, there is no guarantee that it will come back online. 

Should you create an image backup?

Manufacturers tend to offer disk cloning technologies that streamline recovering servers should a failure occur. Some provide a universal restore option that allows you to recover a failed server. When upgrades don’t go as expected, disk images can help recover not only data but a server’s complex configuration.

How many changes are you making to your servers?

Don’t make multiple changes all at once. Adding disks, replacing memory, or installing additional cards should all be performed separately. If things go wrong a day or two after the changes are made and only a single change was executed, it’s much easier to isolate the source of the problem.

Are you monitoring your logs?

After a server upgrade is completed, never presume all is well just because the server booted back up without displaying errors. Monitor log files, error reports, backup operations, and other critical events. Leverage Windows’ internal performance reports to ensure all is performing as intended whenever changes or upgrades are completed.

Did you confirm the OS you are running?

It’s easy to forget the operating system a server is running. By performing a quick audit of the system to be upgraded, you can confirm the OS is compatible and will be able to use the additional resources being installed.
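A quick audit like the one described above can be scripted with nothing but the Python standard library; this is just one illustrative way to gather the basics before ordering an upgrade:

```python
# Pre-upgrade audit sketch: report the OS, release, architecture, and
# whether the running environment is 64-bit (relevant when adding RAM
# beyond what a 32-bit system can address).
import platform
import struct

os_name = platform.system()            # e.g. "Linux" or "Windows"
os_release = platform.release()        # kernel / OS release string
machine = platform.machine()           # e.g. "x86_64"
pointer_bits = struct.calcsize("P") * 8  # 32 or 64

print(f"OS: {os_name} {os_release}")
print(f"Architecture: {machine}")
print(f"{pointer_bits}-bit environment")
```

Running this on each server before purchasing components takes seconds and avoids compatibility surprises.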

Does the chassis support the upgrade?

Server hardware can be notoriously inconsistent. Manufacturers often change model numbers and product designs. Whenever installing additional resources, you should read the manufacturer’s technical specifications before purchasing the upgrades.

Did you double check that it will work?

When installing new server hardware, don’t assume the new component will plug and play well with the server’s operating system. Since the upgrade is being completed on a server, confirm the component is listed on the OS vendor’s hardware compatibility list. It doesn’t hurt to check the server manufacturer’s forums either.

Does the software need an update?

Make sure to keep up with any software adjustments an upgrade requires. For example, you must update a server’s virtual memory settings following a memory upgrade. 

Did you get the most value for your money?

Sure, less expensive disks, RAM, power supplies, and other resources are readily available. But when it comes to servers only high-quality components should be installed. While these items may cost a bit more than others, the performance and uptime benefits more than compensate for any additional expense.

FEATURED

DTC – Clients

DTC works with some of the biggest names in #business! We’re here to help. Give our sales team a call today and get your #data on the right track! P: 1-800-700-7683

 

#Fortune500 #Software #Sports #Food #Beverage #Hospitality #Entertainment #Healthcare #Retail #Education #Energy #Development

FEATURED

Using IT to Help First Responders Save Lives

Imagine sitting in rush-hour traffic on a Friday afternoon when you see an ambulance approaching in your rear-view mirror with its lights flashing. Surely you assume there must be an accident ahead, but what if it were a relative on their way to the hospital?

The question you ask yourself is, “how is there not a better way?” With all of the emerging technology these days, there certainly has to be something to help those who need it most.

Lo and behold: smart cities. Smart cities are the trend of the future, and the technologies that empower them are likely to become a $135 billion market by 2021.

For first responders, the prospect of smart traffic lights is a welcome change. By working with GPS technology in emergency response vehicles, smart traffic lights can help first responders avoid traffic jams and significantly reduce response times.

Even better, sensors that check the structural integrity of buildings, bridges, and roads can increase safety by identifying problems before they cause an accident. Such preventive maintenance can help cities avoid the costs associated with everything from minor injuries to major and fatal accidents.

What could go wrong?

Strategically placed sensors have the potential to improve safety in a multitude of ways. However, city officials are justifiably concerned that the massive amounts of data collected might not be useful and could overburden current systems.

There are two main obstacles standing between city officials and smart city adoption. The first problem is the issue of integrating new technologies within existing systems, and the second problem is figuring out how to ensure the implemented sensors collect beneficial data.

The Apple Watch is a terrific example of how technology can be both helpful and harmful. The ability of the Apple Watch to distinguish between a “fall” and a “drop” could be more than the health-care system bargained for. One could say that the technology has the potential to save lives, especially for the elderly.

On the other hand, in the event of a malfunction, the sensors could generate an excessive number of 911 calls when they aren’t actually needed. With possibly millions of the devices in a densely populated city, it’s easy to see how the issue could escalate and consume emergency call centers with false alarms.

IoT advantages

In spite of the complexities with integration, the cities that do transition to smart cities stand to benefit greatly. A network of connected sensors and devices can reduce the severity of accidents or eliminate them entirely. For instance, Tesla has installed sensors that intelligently avoid impacting other cars.

Recently, the city of Corona, CA migrated to a smart city. The sensors it has implemented provide an incredibly rich picture of what’s happening. Many of the most revolutionary technologies have yet to be invented, but the data gathered by these tools is already helping city officials use their resources more effectively.

For example, officers can distribute Amber Alert information to an entire population, and apps like Waze show transportation officials valuable traffic data so they can reduce bottlenecks. A smart watch might be able to give paramedics vitals of their patients before they even arrive on the scene. No matter the city, smart tech has the potential to improve safety, efficiency and quality of life for residents.

FEATURED

Features of LTO Technology over the Years

Linear Tape-Open (LTO) Ultrium is a high-capacity, single-reel tape storage format created and continually improved by HPE, IBM, and Quantum. LTO tape is a powerful yet scalable tape format that helps address the growing demands of data protection.

PROVIDING GROWTH FOR GENERATIONS.

Originally introduced at the turn of the new millennium, LTO technology is currently in its 8th generation out of a proposed twelve generations. LTO-8 supports storage capacity of up to 30 TB compressed, twice that of the previous generation LTO-7, and data transfer rates of up to 750MB/second. New generations of LTO storage have been launched consistently with higher capacity and transfer rates along with new features to further protect enterprise data. Furthermore, LTO storage is designed for backward compatibility, meaning a drive can write back one generation and read back two generations of tape. Currently, LTO-8 Ultrium drives are able to read and write LTO-7 and LTO-8 media, protecting the data storage investment.
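The general compatibility rule (write back one generation, read back two) can be encoded in a couple of lines. This is a sketch of the rule as stated, using generation numbers as plain integers; note that LTO-8 drives themselves, as mentioned above, handle only LTO-7 and LTO-8 media:

```python
# LTO backward-compatibility rule of thumb: a generation-N drive can
# write media from generations N-1 and N, and read generations N-2
# through N.
def drive_can_read(drive_gen: int, media_gen: int) -> bool:
    return drive_gen - 2 <= media_gen <= drive_gen

def drive_can_write(drive_gen: int, media_gen: int) -> bool:
    return drive_gen - 1 <= media_gen <= drive_gen

# An LTO-7 drive can read LTO-5 media but not LTO-4,
# and can write LTO-6 media but not LTO-5.
assert drive_can_read(7, 5) and not drive_can_read(7, 4)
assert drive_can_write(7, 6) and not drive_can_write(7, 5)
```

A helper like this is handy when planning media migrations across drive refreshes.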

WORM

LTO technology highlights a write-once, read-many (WORM) ability to make certain that your data isn’t overwritten and supports compliance regulations. The LTO WORM operation is designed to give users a very cost-effective means of storing data in a non-rewriteable format. With the increasing importance of regulatory compliance — including the Sarbanes-Oxley Act of 2002, the Health Insurance Portability and Accountability Act of 1996 (HIPAA), and SEC Rule 17-a-4(f) — there is a need for a cost-effective storage solution that can ensure the security of corporate data in a permanent format. LTO WORM uses algorithms based on the Cartridge Memory (CM), in combination with low-level encoding mastered on the tape media, to prevent tampering.

 

Encryption

LTO technology features robust encryption capabilities to heighten security and privacy during storage and transport of tape cartridges. Sadly, it seems like a common occurrence now when a company suffers a breach in security and endangers confidential or private information. Fortunately, recent generation LTO tape drives include one of the strongest encryption capabilities available in the industry to help safeguard the most vulnerable data stored on tape cartridges. LTO tape encryption is standard on all LTO generations since generation 4 (LTO-4). It features a 256-bit symmetric-key AES-GCM algorithm that is implemented at the drive level. This facilitates compression before encryption to maximize tape capacity and deliver high performance during backup. With a rising number of laws, regulations, and financial penalties, a security breach can be damaging for corporations. Data managers are called upon to develop effective security for sensitive data and are turning to tape encryption.
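The ordering of compression before encryption matters because well-encrypted data is indistinguishable from random bytes and therefore incompressible. A quick illustration using the standard zlib library, with random bytes standing in for AES-GCM ciphertext:

```python
import os
import zlib

# Repetitive data, like typical backup streams, compresses very well.
data = b"log entry: backup completed successfully\n" * 100
compressed = zlib.compress(data)
assert len(compressed) < len(data)  # big savings when compressing first

# Random bytes stand in for ciphertext here: compressing after
# encryption gains nothing (zlib even adds a few bytes of overhead).
ciphertext_like = os.urandom(len(data))
assert len(zlib.compress(ciphertext_like)) >= len(ciphertext_like)
```

This is why LTO drives apply their hardware compression to the plaintext stream before the encryption step, rather than the other way around.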

 

Partitioning

More modern generations of LTO technology include a partitioning feature, which helps enhance file control and space management with the Linear Tape File System (LTFS).

Beginning with the 5th generation (LTO-5), LTO technology specifications consist of a partitioning feature that allows for a new standard in ease-of-use and portability.

Partitioning allows a section of the tape to be set aside for indexing, which tells the drive exactly where on the tape a file is stored. The second partition holds the actual file. With LTFS, the indexing information is first read by the drive and presented in a simple, easy-to-use format that allows for “drag and drop” capabilities, similar to a thumb drive.
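The two-partition idea can be sketched as a toy model: a small index partition maps file names to locations in a linear data partition. This is purely illustrative; real LTFS stores an XML index and tracks block-level extents:

```python
class ToyLTFS:
    """Toy model of LTFS's layout: index partition + linear data partition."""

    def __init__(self):
        self.index = {}          # index partition: name -> (offset, length)
        self.data = bytearray()  # data partition: contents appended linearly

    def write(self, name: str, content: bytes) -> None:
        # Record where the file starts and how long it is, then append it.
        self.index[name] = (len(self.data), len(content))
        self.data.extend(content)

    def read(self, name: str) -> bytes:
        # The index tells us exactly where to seek, as on a real tape.
        offset, length = self.index[name]
        return bytes(self.data[offset:offset + length])

fs = ToyLTFS()
fs.write("report.txt", b"quarterly numbers")
assert fs.read("report.txt") == b"quarterly numbers"
```

Reading the small index first is what lets LTFS present a drag-and-drop file listing without scanning the whole tape.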

FEATURED

Why Your Data Storage Strategy Should Include Tape

As most businesses utilize the latest in flash and cloud storage technologies to keep up with extensive data growth, tape technology continues to thrive. The decades-old storage platform has continued to be remarkably dependable throughout the multiple innovations in storage equipment. In fact, tape still offers numerous benefits when it comes to backup, archival and other mass storage of data.

 

Tape’s Total Cost of Ownership (TCO)

 

The cost per gigabyte of tape storage is less than a penny, compared to about three cents for hard disk storage, according to Enterprise Strategy Group (ESG). In the long run, tape is also less expensive than cloud storage; hardware, software, and operational costs are all higher with other forms of data storage technology. Additionally, tape has a smaller footprint and uses considerably less power than disk. ESG found in a 10-year total cost of ownership (TCO) study that an LTO tape solution cost just 14% as much as an all-disk infrastructure and 17% as much as a hybrid disk/cloud storage solution.
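A back-of-the-envelope check of the per-gigabyte figures quoted above makes the gap concrete. This is illustrative only; a real TCO comparison also includes drives, software, and labor:

```python
# Media cost per gigabyte, from the ESG figures cited above.
TAPE_COST_PER_GB = 0.01   # "less than a penny"
DISK_COST_PER_GB = 0.03   # "about three cents"

capacity_gb = 60_000      # a hypothetical 60 TB archive

tape_cost = capacity_gb * TAPE_COST_PER_GB
disk_cost = capacity_gb * DISK_COST_PER_GB

print(f"tape: ${tape_cost:,.0f}  disk: ${disk_cost:,.0f}")
```

Even on media cost alone the difference is threefold, before factoring in tape's lower power draw and footprint.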

 

The Density of LTO Tape Technology

 

One of tape’s key value propositions is its density. The most recent release of Linear Tape Open (LTO) Ultrium 8 technology provides capacity of up to 30TB of compressed storage.

 

The Lifespan of Data Stored on Tape

 

Yet another major benefit of tape is its longevity of data storage. LTO tape media has a lifespan of 30 years or more, with the average tape drive lasting nearly 10 years. In contrast, the average disk storage lasts roughly four years. ESG conducted a lab audit of LTO-8 drives and found them to be more reliable than disk.

 

The Ever-Increasing Speed of LTO Tape

 

Some still hold to the belief that tape is much too slow to be useful in today’s rapidly evolving IT environment. However, no other storage solution has matched the speed increases seen across the eight generations of LTO tape. For instance, LTO-7 provides compressed data transfer rates of up to 750MB per second (more than 2.7TB per hour), compared to the 80MB per second of LTO-3, which was released only ten years prior.
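The per-hour figure above follows directly from the per-second rate (using decimal units, 1 TB = 1,000,000 MB):

```python
def mb_per_sec_to_tb_per_hour(rate_mb_s: float) -> float:
    """Convert a MB/s transfer rate to TB/hour (decimal units)."""
    return rate_mb_s * 3600 / 1_000_000

lto7 = mb_per_sec_to_tb_per_hour(750)  # 2.7 TB/hour, as quoted above
lto3 = mb_per_sec_to_tb_per_hour(80)   # roughly 0.29 TB/hour

print(f"LTO-7: {lto7:.1f} TB/h, LTO-3: {lto3:.2f} TB/h")
```

That is more than a ninefold throughput increase between the two generations.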

 

Data Tape Software

 

Not only has tape increased in density and speed over the years, it has also gotten smarter. Linear Tape File System (LTFS) allows tape data to be read as just another drive on a network. Users can drag and drop files to tape and can see a list of saved files using an operating system directory. LTFS is an open standard supported by LTO drives from any manufacturer. By making it possible to maneuver files on tape just as you would with disk, LTFS allows organizations to use tape for more than backup and archival. Tape becomes part of an “active” archival infrastructure in which data can be moved to the most cost-effective storage tier at any time. As a result, tape is increasingly used for audio/video and surveillance data, and in big data and regulatory compliance use cases.

 

The Future of LTO

 

LTO technology continues to improve. The LTO Consortium recently finalized the LTO-9 specification and announced development plans through the 12th generation of the storage technology. LTO-9 is slated for release in Fall 2020. IBM introduced a tape drive based on LTO-8, the most advanced generation, which offers compressed capacity of up to 30TB (12TB native) and compressed data transfer rates of up to 900MB per second (360MB per second native). The drive comes with AME and AES-256 encryption and write-once, read-many (WORM) capabilities for data protection and is compatible with LTO-7 media.

 

Tape’s low cost, portability, and ease of use have always made it a fantastic choice for long-term archival backup. LTO innovations over the past decade have produced unparalleled increases in capacity and greatly superior economics compared to other storage technologies on the market.

FEATURED

Nvidia Takes Shots at Crypto Miners – Limits GPU Hash Rates


Boy oh boy, there is a conflict developing between avid gamers and crypto-miners, and Nvidia is coming to the rescue. Nvidia, the chip company known for its gaming-friendly graphics processing units (GPUs), is in a pickle as crypto-miners disrupt the market by hoarding the gaming chips for mining rather than the gaming they were intended for. Of course, the massive increase in GPU sales is great for Nvidia, so why is it trying to settle the beef? It turns out that the company’s true consumers, the gamers, are frustrated that the powerful chips are constantly out of stock.
Prior to the launch of its own cryptocurrency chip, Nvidia had been struggling with shortages of its gaming chips, which were being used to mine cryptocurrency. The superior processing power of Nvidia’s GPUs has made the high-end processors a target for crypto-mining entrepreneurs, which has negatively impacted past Nvidia chip rollouts. That means there’s usually a run on the chips when they first launch, as crypto-enthusiasts try to muscle out the actual target audience, the gamers. GPUs are good at performing cryptographic calculations, like computing hashes at high speed, and this sort of algorithm is at the heart of many cryptocurrency mining calculations.



Nvidia said it was taking an important step to help ensure GPUs end up in the hands of gamers with its GeForce RTX 3060 gaming processor launching February 25. Nvidia is programming the RTX 3060 software drivers to detect cryptocurrency mining algorithms and limit mining efficiency, or hash rate, by about 50%, discouraging crypto-miners from buying Nvidia’s gaming GPUs.



Nvidia is limiting the hash rate on all GPUs going forward

Nvidia released its highly sought-after GeForce RTX 3060 chipset, which includes a bonus for gamers: drivers that foil the crypto-mining agenda. Industry experts admire Nvidia’s effort to stay true to its consumer base but are uncertain the move will take the added pressure off gamers and their setups. The GeForce RTX 3060 software drivers have been designed to find specific attributes of the Ethereum cryptocurrency mining algorithm and limit crypto-mining efficiency, or hash rate, by around 50%. It’s not that the performance of the chipset has been lowered; instead, the drivers simply throttle Ethereum-specific use. Ethereum is the second-largest cryptocurrency, behind the infamous Bitcoin.

In a nutshell, Nvidia will attempt to identify the code that is running and enact a denial-of-service action against software it thinks is trying to do Ethereum calculations on the GPU. This appears to be the most logical method given that calculations used for mining Ethereum have a unique signature that the drivers can easily identify. Nvidia’s anti-crypto drivers work by detecting memory usage that looks like an Ethereum algorithm and cutting the hashing speed in half.
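The overall shape of that logic can be sketched in a few lines. To be clear, the real detection lives in Nvidia's proprietary drivers; the "signature" fields below are hypothetical stand-ins for the memory-access patterns described above:

```python
# Illustrative sketch only: if a workload's observed behavior matches a
# known mining signature, halve the effective hash rate; otherwise run
# at full speed. The signature keys here are invented for illustration.
ETHASH_SIGNATURE = {"dag_sized_reads": True, "random_access_pattern": True}

def effective_hash_rate(base_rate_mh: float, workload: dict) -> float:
    looks_like_ethash = all(
        workload.get(key) == value for key, value in ETHASH_SIGNATURE.items()
    )
    return base_rate_mh * 0.5 if looks_like_ethash else base_rate_mh

# A mining-like workload is throttled to half speed; a game is untouched.
assert effective_hash_rate(50.0, dict(ETHASH_SIGNATURE)) == 25.0
assert effective_hash_rate(50.0, {"dag_sized_reads": False}) == 50.0
```

The key design point, as the article notes, is that the GPU itself is not slowed; only workloads matching the mining signature are penalized.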


Nvidia introduces a new crypto mining GPU

Although it seems as if Nvidia is favoring their loyal fan base of gamers, crypto miners need not fret. They’re also rolling out a chip that crypto-miners can call their very own. Nvidia announced a Cryptocurrency Mining Processor (CMP) product line to address the specific needs of Ethereum mining. CMP products are optimized for the best mining performance and efficiency. Even better, CMP products don’t meet the specifications required of a GeForce GPU, so they won’t affect the availability of GeForce GPUs to gamers. Overall fascination with Ethereum has soared in the past month due to the rapid rise in the value of Bitcoin. When Bitcoin appreciates it tends to lift other cryptocurrencies along with it. The recent rise in crypto-mania has resulted in the Nvidia chip shortage as miners are buying their powerful GPUs in bulk. The current lack of Nvidia’s graphics cards is intensified by persistent demand from miners. The consequence has been empty virtual and retail shelves, in addition to absurdly inflated pricing.

That’s where Nvidia’s CMP products come in. The key difference between the CMP products and typical gaming cards is that CMP products lack video outputs. Crypto miners seldom need displays attached to their systems. Usually, they plug multiple GPUs into a rig and manage everything through a web dashboard. Cryptocurrency mining is all about computing performance. If companies like Nvidia can quickly bring crypto mining cards to market, they’re all going to be readily consumed by small home operations and massive crypto mining farms.

