by Rick Braddy | Press Coverage
The 5 pillars of cloud data management
As the lifeblood of your business, data must be easily available in the cloud to boost your agility and ability to innovate, but easy accessibility must be balanced with protection to ensure maximum business value.
As more and more businesses adopt cloud services, seizing on the latest software tools and development methodologies, the lines between them are blurring. What really distinguishes one business from the next is its data.
Much of the intrinsic value of a business resides in its data, but we’re not just talking about customer and product data; there’s also supply chain data, competitor data, and many other types of information that might fall under the big data umbrella. Beyond that, there are a multitude of smaller pieces of data, from employee records to HVAC system logins, that are rarely considered but are necessary for the smooth running of any organization. And don’t forget about source code: your developers are using cloud-based repositories for version control of application code, and it needs to be protected too.
In the past, companies would typically try to centralize their data and lock it safely away in an impenetrable vault, but hoarding data doesn’t allow you to extract value from it. Data gains business value when it moves from place to place as needed and is available to be leveraged, not locked away in some dark place. People need swift, easy access to data and real-time analysis to make innovative leaps, achieve operational excellence and gain that all-important competitive edge.
Managing the mess
As the importance of data has grown clearer, many businesses have been stockpiling as much of it as they can get their hands on, with the idea that the value will come along later. Businesses grow organically: new systems and software are adopted, mergers and acquisitions prompt integrations and migrations, and new devices and endpoints are added to networks all the time. Even the most organized of businesses inevitably ends up with a complex structure and data that’s distributed globally.
Another layer that exacerbates this problem is people. Sometimes your employees will show poor judgment. They may inadvertently wipe out critical data or accidentally delete configuration files. Disgruntled employees may even do these things deliberately. Then you must consider all the employees and contractors working for your partners and vendors, who often have access to your business-critical data.
To effectively manage your data without locking it away and blocking legitimate requests for access, you need a solid cloud data management strategy, and that begins with five important considerations.
1. Resting data
Most of the time, data sits in storage. It’s often behind firewalls and other layers of security, as it should be, but it’s also vital to ensure that your data is encrypted. It should be encrypted all the time, even when you think it’s safely tucked away in your vault.
If you properly protect your data at rest by encrypting it, then anyone stealing it will end up with lines of garbled junk that they can’t decipher. You may think it’s unlikely a cybercriminal will breach your defenses, but what about a motivated insider with malicious intent, or even a careless intern? Hackers’ most common point of penetration is actually your employees’ devices, where they gain a foothold that can be leveraged to go deeper into your networks. Encrypt everything and take proper precautions to restrict access to the decryption key.
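To make the point concrete, here’s a minimal Python sketch of encryption at rest using the open-source cryptography package; the sample record and key handling are illustrative only, and in production the key would live in a KMS or HSM rather than in code.

```python
# Minimal sketch: encrypt data before it is written to storage.
# Requires the open-source "cryptography" package (pip install cryptography).
from cryptography.fernet import Fernet

# Illustrative only: a real key belongs in a KMS/HSM, never in source code.
key = Fernet.generate_key()
cipher = Fernet(key)

record = b"customer_id=4471,card_last4=9921"   # hypothetical record
ciphertext = cipher.encrypt(record)            # what actually sits at rest

# A thief without the key sees only opaque bytes...
print(ciphertext[:24])

# ...while a key holder can recover the original.
assert cipher.decrypt(ciphertext) == record
```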
2. Accessing data
It’s very important that your employees can access the data they need to do their jobs whenever and wherever they want, but access must also be controlled. Start by analyzing which people need access to what data and create tailored access rights and controls that restrict unnecessary access. Any person requesting access to data must be authenticated and every data transaction should be recorded so you can audit later if necessary. Active Directory is the most common place to manage and control such access today.
Access control should also scan the requesting device to ensure it’s secure and doesn’t harbor any malware or viruses. Analyzing behavior to see if the user or device requesting access falls into normal patterns of use can also be a great way of highlighting nefarious activity.
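As a rough illustration of tailored access rights plus a recorded audit trail, here’s a hedged Python sketch; the role map and function names are hypothetical, not a specific directory or product API, and a real deployment would back this with Active Directory or a similar identity provider.

```python
# Sketch of role-based access control with an audit log of every decision.
# ROLE_GRANTS and check_access are illustrative names, not a product API.
import logging
from datetime import datetime, timezone

logging.basicConfig(filename="data_access_audit.log", level=logging.INFO)

# Map each role to the datasets it legitimately needs.
ROLE_GRANTS = {
    "finance_analyst": {"invoices", "payroll"},
    "support_agent":   {"tickets"},
}

def check_access(user: str, role: str, dataset: str) -> bool:
    """Authorize a request and record the transaction for later audit."""
    allowed = dataset in ROLE_GRANTS.get(role, set())
    logging.info("time=%s user=%s role=%s dataset=%s allowed=%s",
                 datetime.now(timezone.utc).isoformat(),
                 user, role, dataset, allowed)
    return allowed

print(check_access("alice", "finance_analyst", "payroll"))  # True
print(check_access("bob", "support_agent", "payroll"))      # False, and logged
```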
3. Data in transit
It’s crucial to create a secure, authenticated and encrypted tunnel between the authenticated user and device and the data they’re requesting. You want to make the data transfer as swift and painless as possible for the end user, but without compromising security. Make sure data remains encrypted in transit, so no interceptor can read it. Choosing the right firewalls and virtual private network (VPN) services is vital. You may also want to compartmentalize endpoints to keep data safely siloed, or employ virtualization to ensure it doesn’t reside on insecure devices.
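As a small, hedged example of keeping data encrypted in transit, the Python sketch below fetches data over TLS with certificate verification enforced; the URL is a placeholder and requests is the widely used third-party HTTP library.

```python
# Minimal sketch: always move data over verified TLS.
# Placeholder URL; requires the "requests" package (pip install requests).
import requests

# verify=True (the default) validates the server's certificate chain, so an
# interceptor can neither read the payload nor impersonate the endpoint.
response = requests.get("https://data.example.com/reports/q3",
                        verify=True, timeout=10)
response.raise_for_status()
print(len(response.content), "bytes received over an encrypted channel")

# Never ship verify=False: it silently accepts any certificate and
# reintroduces exactly the interception risk this pillar is meant to remove.
```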
There’s no doubt that most companies focus their data protection efforts here, and it is important, but don’t focus on data in transit to the detriment of other areas.
4. Arriving data
When the data arrives at its destination you want to be certain that it is authentic and hasn’t been tampered with. Can you prove data integrity? Do you have a clear audit trail? This is key to effectively managing data and reducing the risk of any breach or infection. Phishing attacks often show up in the inbox as genuine data to fool people into clicking somewhere they shouldn’t and downloading malware that bypasses your carefully constructed defenses.
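One common way to answer the “can you prove data integrity?” question is a keyed digest attached to every payload. Here’s a minimal Python sketch using HMAC-SHA256; the shared secret and payload are illustrative stand-ins for whatever signing scheme your systems actually use.

```python
# Minimal sketch: verify that arriving data is authentic and untampered.
import hashlib
import hmac

SECRET = b"rotate-me-and-keep-me-in-a-vault"   # illustrative only

def sign(payload: bytes) -> str:
    """Sender computes this tag and transmits it alongside the payload."""
    return hmac.new(SECRET, payload, hashlib.sha256).hexdigest()

def verify(payload: bytes, tag: str) -> bool:
    """Receiver recomputes the tag; compare_digest resists timing attacks."""
    return hmac.compare_digest(sign(payload), tag)

payload = b'{"order": 1042, "amount": 99.50}'
tag = sign(payload)
print(verify(payload, tag))          # True: arrived intact
print(verify(payload + b"x", tag))   # False: tampered in transit
```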
5. Defensible backup and recovery
Even with the first four pillars solidly implemented, things can and do go sideways when least expected. Most companies today recognize the importance of proper backup hygiene and have implemented backup and recovery processes. Be sure to periodically test and validate your ability to restore the backups and actually recover.
In the cloud, there’s another critical area to carefully consider: be careful not to put all your data eggs in one basket. Do not store your backups in the same cloud account where your production data resides. That’s a formula for disaster you may not recover from, should a hacker somehow gain access to your network and delete everything.
Instead, leverage multiple cloud accounts to segregate your backup data from your production data. Be certain to back up your cloud infrastructure configuration information as well, in case you ever need to rebuild it for any reason.
In the unlikely event your production environment should somehow become compromised, it’s critical that a copy of all backups and cloud configuration be stored separately and secured from tampering and deletion. One way to do this is to create a separate backup account (on the same cloud or a different one) with a “write only” policy that allows backup and archival data to be written and read, but not deleted. This protects your business by ensuring your DR systems and backups will always be available should you need them to recover.
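On AWS, one hedged way to approximate that “write only” policy is an S3 bucket policy that denies deletion outright; the bucket name below is a placeholder, and you’d pair this with versioning or S3 Object Lock in the segregated backup account.

```python
# Sketch: an S3 bucket policy that lets backups be written and read,
# but never deleted. Bucket name/ARN are placeholders.
import json
import boto3

policy = {
    "Version": "2012-10-17",
    "Statement": [{
        "Sid": "DenyBackupDeletion",
        "Effect": "Deny",
        "Principal": "*",
        "Action": ["s3:DeleteObject", "s3:DeleteObjectVersion"],
        "Resource": "arn:aws:s3:::example-backup-bucket/*",
    }],
}

s3 = boto3.client("s3")
s3.put_bucket_policy(Bucket="example-backup-bucket",
                     Policy=json.dumps(policy))
# Combine with versioning or Object Lock so that even a compromised
# production credential cannot destroy the backup copies.
```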
By crafting a plan to cover data storage, data access, data in transit, data arrival, and defensible data backup/recovery, you’ve erected five pillars strong enough to bear the load of your company data and withstand the forces that are trying to break in. But there are still many cloud data management pitfalls to avoid. Ensure that you can quickly recover from the most common issues that arise from operating in cloud environments.
You can have the best products and employees in the world, but without data they are powerless, so take steps to ensure it flows freely and safely. Smart data management will empower your staff to leverage the latest cloud technologies, innovate new products and services and differentiate your organization from the competition.
by Buurst Staff | Press Coverage

SoftNAS cuts the time and complexities of cloud data migration projects by up to 90%
SoftNAS delivers cloud storage cost savings by leveraging durable, low-cost cloud object storage. It cuts the time and complexities of cloud data migration projects by up to 90%, turning months into weeks and simplifying live data migrations to the cloud. It enables customers to connect any kind of data to the cloud anywhere in the world.
SoftNAS accelerates high-speed global data transfers up to twenty times faster with its patented UltraFast technology, enabling customers to connect remote offices, branch offices, factories and IoT at the edge with the cloud.
Features of SoftNAS 4 address the impediments that block real-world cloud adoption, revolving around common cloud storage use cases: primary and secondary cloud file storage, workload and application cloud migration, hybrid cloud, and synthetic cloud backups in partnership with Veeam. The company conducted customer interviews and identified 18 discrete cloud adoption barriers that SoftNAS 4 solves.
“Based upon our multi-cloud research we uncovered a number of challenges customers are struggling with to take advantage of public cloud infrastructure. SoftNAS 4 addresses many of the customer data management and control challenges head on with its unique combination of cloud NAS, bulk data transfer acceleration and data integration capabilities. SoftNAS is delivering a robust cloud data fabric companies can use as a strategy to more quickly adopt the cloud and save time and money in the process,” said Jeff Kato, senior analyst, The Taneja Group.
“Today we are delivering on our company mission and the vision to become the fabric for business data in the cloud. After three years of relentless company focus, strategic input from our cloud platform partners and customers, we have produced what I believe is the gold standard in cloud data control and management software,” said Rick Braddy, CEO/CTO and founder, SoftNAS. “When I founded SoftNAS six years ago as a disgruntled traditional storage customer, my goals were to reduce the high costs of storage, make it easier to connect applications with the cloud and keep IT in control of its destiny in the cloud. SoftNAS 4 is the realization of those goals and I invite customers and partners to start benefiting from it today,” Braddy said.
SoftNAS 4 features include:
- UltraFast is the company’s patented technology that saves on time and costs by accelerating global bulk data movement up to twenty times faster than standard TCP/IP protocols, at one-tenth the cost of alternative bulk data transfer solutions. UltraFast accelerates transfers of data into, out of and across private, public, hybrid clouds and multi-clouds.
- UltraFast overcomes up to 4,000ms of latency and up to 10% packet loss due to network congestion, connecting remote locations with the cloud over any kind of network conditions.
- Lift and Shift with continuous sync enables live migration of production data and workloads and keeps content up-to-date when moving data to the cloud, between datacenters and/or distributing it to remote locations. Automatic restart and suspend/resume ensure bulk data transfer jobs run reliably, while bandwidth schedules enable customers to open or throttle network usage. Lift and Shift works with UltraFast, so migrations can take place over any kind of network anywhere in the world.
- FlexFiles integrates and transforms 24 types of data by leveraging Apache NiFi and a set of pre-built data integration processors. With FlexFiles, customers can tackle massively complex data integration projects combining file systems, Hadoop, Redshift, HTTP(S), Web Services, SQL/noSQL, XML, S3/Blob Objects and Custom Data Integrations.
- FlexFiles lets customers connect their business with the cloud and SaaS services through a drag-and-drop interface, with no programming required. It also leverages UltraFast, so FlexFiles and Apache NiFi can be used over any network conditions.
- ObjFast makes cloud object storage run at near-block-level performance while still taking advantage of object storage pricing, which results in up to 67% cost savings vs. block storage alone. It throttles data transfer to cloud object storage so it’s as fast as possible without exceeding AWS or Azure object storage R/W capabilities.
- ObjFast’s patent-pending acceleration technology has been optimized, tested, and certified for Veeam Synthetic Full Backups, so Veeam cloud backup, copy jobs, and recovery run at near-block-level performance with the public cloud.
- SmartTiers is a patent-pending automated storage tiering feature that moves aging data from more expensive, high-performance block storage to less expensive object storage according to customer-set policies, reducing public cloud storage costs by up to 67%. It is available for beta testing; a generic sketch of this tiering pattern follows below.
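SmartTiers itself is proprietary, but the policy-driven tiering pattern it describes can be sketched with a native AWS S3 lifecycle rule that moves aging objects to cheaper storage classes; the bucket name and day thresholds below are illustrative placeholders, not SoftNAS’s implementation.

```python
# Generic sketch of policy-driven tiering: transition aging objects to
# progressively cheaper storage classes. Placeholder bucket and thresholds.
import boto3

s3 = boto3.client("s3")
s3.put_bucket_lifecycle_configuration(
    Bucket="example-aging-data",
    LifecycleConfiguration={
        "Rules": [{
            "ID": "tier-aging-data",
            "Status": "Enabled",
            "Filter": {"Prefix": ""},   # apply the rule to every object
            "Transitions": [
                {"Days": 30, "StorageClass": "STANDARD_IA"},  # infrequent access
                {"Days": 90, "StorageClass": "GLACIER"},      # archival tier
            ],
        }],
    },
)
```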
SoftNAS 4 is available in three product editions that run on AWS, Microsoft Azure, and vSphere. Customers can launch all of the company’s cloud products on-demand, directly from the AWS and Azure Marketplaces, to spin up cloud data solutions in just minutes with no prior purchasing approvals.
Visit StorageNewsletter.com.
by Buurst Staff | Press Coverage

General availability of SoftNAS version 4 was announced by Houston-based SoftNAS. Available in three product editions that run on Amazon Web Services (AWS), Microsoft Azure and VMware vSphere, the company claims that SoftNAS 4 “accelerates high-speed global data transfers up to 20 times faster with its patented SoftNAS UltraFast technology, enabling customers to connect remote offices, branch offices, factories and Internet of Things (IoT) at the edge with the cloud.”
New features of SoftNAS 4 include SoftNAS UltraFast, SoftNAS Lift and Shift, SoftNAS FlexFiles, SoftNAS ObjFast and SoftNAS SmartTiers.
According to the announcement, the SoftNAS patented UltraFast technology “saves on time and costs by accelerating global bulk data movement up to 20 times faster than standard TCP/IP protocols at one-tenth the cost of alternative bulk data transfer solutions.”
The SoftNAS Lift and Shift feature will provide live migration of production data and workloads and “keeps content up-to-date when moving data to the cloud, between datacenters and/or distributing it to remote locations.”
SoftNAS FlexFiles leverages Apache NiFi and other pre-built data integration processors in order to integrate and transform 24 different types of data. According to the press release, customers will be able to “tackle massively complex data integration projects combining file systems, Hadoop, Redshift, HTTP(S), Web Services, SQL/noSQL, XML, S3/Blob Objects and Custom Data Integrations.”
SoftNAS ObjFast is a patent-pending acceleration technology that has been “optimized, tested and certified for Veeam Synthetic Full Backups, so Veeam cloud backup, copy jobs and recovery runs at near-block-level performance with the public cloud.”
SoftNAS SmartTiers uses customer-set policies to move aging data from more expensive block storage to less expensive object storage, which the company claims can “[reduce] public cloud storage costs by up to 67 percent and is now available for beta testing.”
“Based upon our multi-cloud research we uncovered a number of challenges customers are struggling with to take advantage of public cloud infrastructure. SoftNAS 4 addresses many of the customer data management and control challenges head on with its unique combination of cloud NAS, bulk data transfer acceleration and data integration capabilities. SoftNAS is delivering a robust cloud data fabric companies can use as a strategy to more quickly adopt the cloud and save time and money in the process,” said Jeff Kato, a senior analyst for The Taneja Group, in a prepared statement.
With SoftNAS 4 released to general availability, customers will be able to “launch all SoftNAS products on-demand, directly from the AWS and Azure Marketplaces to spin up cloud data solutions in just minutes with no prior purchasing approvals,” according to the announcement.
About the Author
Wendy Hernandez is group managing editor for the 1105 Enterprise Computing Group.
Visit Virtualization & Cloud Review.