Company becomes a member in Cloud28+, Streaming Video Alliance (SVA) and The Open Group OSDU™ Forum
Houston, Texas – March 11, 2021 – Buurst, a leading enterprise-class data performance company, announced today that the company has joined Cloud28+, The Open Group OSDU™ Forum and Streaming Video Alliance (SVA) to help share and gain knowledge about the needs of communities and vertical markets, as well as the solutions that solve data management needs in the cloud and at the edge.
Buurst’s participation in Cloud28+, the world’s largest community promoting hybrid cloud services and knowledge sharing sponsored by HPE, offers the opportunity to share the SoftNAS and Fuusion technology solutions with a global audience of IT leaders. “The aim of Cloud28+ is to provide customers with the best possible choice of solutions to match their business requirements,” said Xavier Poisson Gouyou Beauchamps, vice president, Service Providers and Cloud28+, Hewlett Packard Enterprise. “It’s exciting for us to unite so many leading service and software providers, such as Buurst, in one place to offer customers a comprehensive answer to their hybrid IT needs.”
The company also joined the Streaming Video Alliance (SVA) to work with industry-leading companies to advance video streaming technology by developing open source solutions, contributing insights into how current Buurst customers are managing video content, and learning from other participants. “We are pleased to have Buurst join the SVA, offering our community valuable experience in cloud storage, as well as their initiatives in edge data transfer technology,” said Jason Thibeault, executive director, Streaming Video Alliance.
Buurst is also announcing its membership in the OSDU Forum to help drive an open source, standards-based, technology-agnostic data platform for the energy industry along with other industry-leading companies. The opportunity to help reduce data silos to enable transformational workflows, while also accelerating the deployment of emerging digital solutions for better decision making in an open, standards-based ecosystem, benefits not only Buurst and its customers, but also the energy industry.
“I am very pleased with our team’s efforts to join communities where vendors and users come together in an effort to understand the market needs and truly make a difference,” commented Vic Mahadevan, CEO of Buurst. “Our work with some of the world’s most innovative tech companies is ultimately making our customers’ life easier. Sharing our vision of data capabilities in the cloud and at the edge with this audience ensures feedback from a broad segment of users.”
Buurst is a data storage and management software company that provides solutions for enterprises’ data use in the cloud or at the edge. The company provides flexibility, security, and access to corporate data at all touch points within the business while reducing storage expenses by eliminating double billing for cloud data management. To learn more, visit www.buurst.com or follow the company on Twitter, LinkedIn, and Vimeo.
The 5 pillars of cloud data management
As the lifeblood of your business, data must be easily available in the cloud to boost your agility and ability to innovate, but easy accessibility must be balanced with protection to ensure maximum business value.
As more and more businesses adopt cloud services, seizing on the latest software tools and development methodologies, the lines between them are blurring. What really distinguishes one business from the next is its data.
Much of the intrinsic value of a business resides in its data, and not just customer and product data: there’s also supply chain data, competitor data, and many other types of information that might fall under the big data umbrella. Beyond that there are a multitude of smaller pieces of data, from employee records to HVAC system logins, that are rarely considered but are necessary for the smooth running of any organization. And don’t forget about source code. Your developers are using cloud-based repositories for version control of application code, and that code also needs to be protected.
In the past, companies would typically try to centralize their data and lock it safely away in an impenetrable vault, but hoarding data doesn’t allow you to extract value from it. Data gains business value when it’s transported from place to place as needed and available to be leveraged, not locked away in some dark place. People need swift, easy access to data and real-time analysis to make innovative leaps, achieve operational excellence and gain that all-important competitive edge.
Managing the mess
As the importance of data has grown clearer, many businesses have been stockpiling as much of it as they can get their hands on with the idea that the value will come along later. Businesses grow organically, so new systems and software are adopted, mergers and acquisitions prompt integrations and migrations, and new devices and endpoints are added to networks all the time. Even the most organized of businesses inevitably ends up with a complex structure and data that’s distributed globally.
Another layer that exacerbates this problem is people. Sometimes your employees will show poor judgement. They may unexpectedly wipe out critical data or accidentally delete configuration files. Disgruntled employees may even do these things deliberately. Then you must consider all the employees and contractors working for your partners and vendors, who often have access to your business-critical data.
To effectively manage your data without shuttering it and blocking legitimate requests for access, you need a solid cloud data management strategy and that begins with five important considerations.
1. Resting data
Most of the time data sits in storage. It’s often behind firewalls and other layers of security, which it should be, but it’s also vital to ensure that your data is encrypted. It should be encrypted all the time, even when you think it’s safely tucked up in your vault.
If you properly protect your data at rest by encrypting it, then anyone stealing it will end up with lines of garbled junk that they can’t decipher. You may think it’s unlikely a cybercriminal will breach your defenses, but what about a motivated insider with malicious intent or even a careless intern? Hackers’ most common point of penetration is actually your employees’ devices, whereby they gain a foothold that can be leveraged to go deeper into your networks. Encrypt everything and take proper precautions to restrict access to the decryption key.
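To make the “garbled junk” point concrete, here is a minimal Python sketch using a throwaway one-time pad, purely to illustrate that properly encrypted data at rest is unreadable without the key. This is an illustration only; a real deployment should use a vetted library (for example, the `cryptography` package’s Fernet) or the cloud provider’s key management service.

```python
import secrets

# Illustration only: a one-time pad (a random key XORed with the data).
# Production systems should use a vetted AEAD cipher via a maintained
# library or a cloud KMS -- never roll your own crypto.
plaintext = b"customer-records: alice,bob,carol"
key = secrets.token_bytes(len(plaintext))  # keep this key locked down

ciphertext = bytes(p ^ k for p, k in zip(plaintext, key))
recovered = bytes(c ^ k for c, k in zip(ciphertext, key))

print(ciphertext != plaintext)  # an attacker without the key sees only junk
print(recovered == plaintext)   # the key holder can still read the data
```

The point survives the toy cipher: whoever exfiltrates the ciphertext alone gets nothing usable, which is why restricting access to the decryption key matters as much as the encryption itself.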
2. Accessing data
It’s very important that your employees can access the data they need to do their jobs whenever and wherever they want, but access must also be controlled. Start by analyzing which people need access to what data and create tailored access rights and controls that restrict unnecessary access. Any person requesting access to data must be authenticated and every data transaction should be recorded so you can audit later if necessary. Active Directory is the most common place to manage and control such access today.
Access control should also scan the requesting device to ensure it’s secure and doesn’t harbor any malware or viruses. Analyzing behavior to see if the user or device requesting access falls into normal patterns of use can also be a great way of highlighting nefarious activity.
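The ideas above — tailored access rights plus a record of every transaction — can be sketched in a few lines. The role-to-dataset mapping below is hypothetical; in practice it would live in a directory service such as Active Directory or an IAM policy store, not in application code.

```python
from datetime import datetime, timezone

# Hypothetical role-to-dataset mapping for illustration; real deployments
# would manage this in Active Directory or an IAM policy store.
ACCESS_RIGHTS = {
    "finance-analyst": {"invoices", "payroll"},
    "support-agent": {"tickets"},
}

audit_log = []  # every decision is recorded so it can be audited later


def request_access(user: str, role: str, dataset: str) -> bool:
    """Allow access only if the role grants it, and record the attempt."""
    allowed = dataset in ACCESS_RIGHTS.get(role, set())
    audit_log.append({
        "when": datetime.now(timezone.utc).isoformat(),
        "user": user,
        "dataset": dataset,
        "allowed": allowed,
    })
    return allowed


print(request_access("alice", "finance-analyst", "payroll"))  # True
print(request_access("bob", "support-agent", "payroll"))      # False, but logged
```

Note that denied requests are logged too: an audit trail of refusals is often the first signal of the “nefarious activity” behavioral analysis is meant to catch.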
3. Data in transit
It’s crucial to create a secure, authenticated and encrypted tunnel between the authenticated user and device and the data they’re requesting. You want to make the data transfer as swift and painless as possible for the end user, but without compromising security. Make sure data remains encrypted in transit, so no interceptor can read it. Choosing the right firewalls and virtual private network (VPN) services is vital. You may also want to compartmentalize endpoints to keep data safely siloed or employ virtualization to ensure it doesn’t reside on insecure devices.
There’s no doubt that most companies focus their data protection efforts here and it is important, but don’t focus on data in transit to the detriment of other areas.
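As a small illustration of what “encrypted in transit” means at the application layer, Python’s standard library defaults already enforce the essentials: a default TLS client context refuses to talk to a peer that can’t present a valid, name-matching certificate. This is a sketch of the client-side checks only; it doesn’t replace firewalls or VPNs discussed above.

```python
import ssl

# A default client context is configured for server authentication:
# it will not proceed without verifying the peer's certificate chain
# and matching the certificate against the hostname being contacted.
ctx = ssl.create_default_context()

print(ctx.verify_mode == ssl.CERT_REQUIRED)  # peer certificate is mandatory
print(ctx.check_hostname)                    # hostname must match the cert
```

Wrapping a socket with this context (via `ctx.wrap_socket(...)`) gives you the encrypted, authenticated tunnel; the common mistake is disabling these checks to “make it work,” which silently removes the protection.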
4. Arriving data
When the data arrives at its destination you want to be certain that it is authentic and hasn’t been tampered with. Can you prove data integrity? Do you have a clear audit trail? This is key to effectively managing data and reducing the risk of any breach or infection. Phishing attacks often show up in the inbox as genuine data to fool people into clicking somewhere they shouldn’t and downloading malware that bypasses your carefully constructed defenses.
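One common way to prove data integrity on arrival is a message authentication code: the sender attaches a tag computed over the payload with a shared secret, and the receiver recomputes it. The sketch below uses Python’s standard `hmac` module; the shared key is a placeholder and would live in a secrets manager, and systems without a shared secret would use digital signatures instead.

```python
import hashlib
import hmac

SHARED_KEY = b"example-shared-secret"  # placeholder; keep real keys in a secrets manager


def sign(payload: bytes) -> str:
    """Sender attaches an HMAC tag so the receiver can verify integrity."""
    return hmac.new(SHARED_KEY, payload, hashlib.sha256).hexdigest()


def verify(payload: bytes, tag: str) -> bool:
    """Receiver recomputes the tag; constant-time compare resists timing attacks."""
    return hmac.compare_digest(sign(payload), tag)


message = b'{"invoice": 1042, "amount": 1800}'
tag = sign(message)

print(verify(message, tag))                               # True: untampered
print(verify(b'{"invoice": 1042, "amount": 9999}', tag))  # False: modified in flight
```

Any change to the payload, however small, invalidates the tag, which is exactly the property an audit trail needs when deciding whether arriving data is authentic.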
5. Defensible backup and recovery
Even with the first four pillars solidly implemented, things can and do go sideways from time to time when least expected. Most companies recognize the importance of proper backup hygiene today and have implemented backup and recovery processes. Be sure to periodically test and validate your ability to restore the backups and actually recover.
In the cloud, there’s another critical area to carefully consider. Be careful not to put all your data eggs in one basket. Do not store your backups in the same cloud account where your production data resides. That’s a formula for disaster you may not recover from should a hacker somehow gain access to your network and delete everything.
That is, leverage multiple cloud accounts to segregate your backup data from your production data. Be certain to back up your cloud infrastructure configuration information as well, in case you ever need to rebuild it for any reason.
In the unlikely event your production environment is somehow compromised, it’s critical that a copy of all backups and cloud configuration is stored separately and secured from tampering and deletion. One way to do this is to create a separate backup account (on the same cloud or a different one) with a policy that allows backup and archival data to be written and read, but never deleted. This protects your business by ensuring your DR systems and backups will always be available should you need them to recover.
By crafting a plan to cover data storage, data access, data in transit, data arrival, and defensible data backup/recovery, you’ve erected five pillars that will be strong enough to bear the load of your company data and withstand the forces which are trying to break in. But there are still many cloud data management pitfalls to avoid. Ensure that you can quickly recover from the most common issues that arise from operating in cloud environments.
You can have the best products and employees in the world, but without data they are powerless, so take steps to ensure it flows freely and safely. Smart data management will empower your staff to leverage the latest cloud technologies, innovate new products and services and differentiate your organization from the competition.
It’s important to weigh up the costs and limitations of traditional data centers and consider transitioning your business to the cloud. By modernizing your infrastructure, you can focus on gaining a competitive edge in your core business.
Agility and speed are of paramount importance for most organizations as they try to innovate and differentiate themselves from the competition. The need for flexibility and rapid scalability is driving more and more companies into the cloud, as traditional data centers are no longer proving to be competitive, agile or robust enough.
It should come as no surprise that Cisco predicts 94 percent of workloads and compute instances will be processed by cloud data centers by 2021. But deciding when to take the leap, weighing the costs and risks, and developing a successful strategy is easier said than done. Let’s take a closer look at why companies are ditching those data centers and how they can make the transition as smooth as possible.
The push of traditional data center costs
Traditional data centers are enormously expensive to maintain. To set one up you need to find a suitable space and then fit it out with everything from uninterruptible power systems (UPS) to cooling HVAC units that keep servers from overheating, not to mention extensive investments in storage and networking equipment.
All of that comes before you consider the cost of hiring data center personnel with the right expertise to keep things running. These are employees outside your core competency, required just to keep your infrastructure working. Then there’s the ongoing energy costs of maintaining the data center and dealing with maintenance.
Economies of scale are important across the board in data centers, but they make a huge difference when it comes to operating and energy costs. A Ponemon Institute report found that the average annual cost per kW for a data center that’s 50,000 square feet or larger was $5,467, compared to an annual cost of $26,495 per kW for data centers between 500 and 5,000 square feet in size.
It’s not easy to scale up and down quickly, so when you’re not using full capacity, cash is being burned. When you push beyond capacity you’re faced with the prospect of expensive expansion or outsourcing. Even outsourcing the physical data center to a colo facility leaves one effectively in the data center hardware and IT infrastructure business.
The pull of the cloud
As many as 81 percent of enterprises now operate multi-cloud strategies, according to RightScale’s State of the Cloud Report. Business units want to be free to adopt the very latest tools and technologies. They want to be able to pivot and pounce where they see an opportunity, innovating through machine learning and AI, the automation of software development pipelines, and greater depth in agile data analysis. And they want to do all this unencumbered by an internal IT department and bureaucracy.
Shadow IT is a reality, whether you accept it or not, and that genie is not going back into the bottle. Gartner studies have found that shadow IT accounts for 30 to 40 percent of IT spending in large enterprises, while the Everest Group suggests it’s closer to 50 percent or more. It’s time to embrace the cloud. Maintaining a data center and a layer of IT infrastructure to support it is fast becoming untenable.
No wonder Gartner names cloud system infrastructure services as the fastest growing segment in a fast-growing market. It’s possible to get better service and greater value by leveraging this competitive space.
While cost is often perceived as a big barrier, cloud spending can be offset by the savings from shutting down the data center or reducing its capacity. Security is evolving from a big concern into a driver for cloud adoption. Major cloud providers protect data by encrypting and segmenting it across several locations, and they hire the finest talent available because security is their core business.
The shift to the cloud won’t happen overnight, but make no mistake, it is happening. The big question then becomes – how do you plan for and make the transition successfully?
Creating a roadmap
Most organizations have huge data centers and colos with lots of entanglements based on decades of legacy and acquisitions. There’s no way to unwind all of this quickly. For the health of the business, disruption must be managed carefully, which means assessing existing infrastructure and business assets and developing a plan to move them incrementally. Look at how to start developing new apps in the cloud and divest yourself of legacy infrastructure step-by-step.
What is required to mitigate the risk and enable organizations to transition is some sort of connective tissue that bridges the gaps between your data centers, applications and legacy systems and your chosen cloud environments. Make sure you formulate a strategy that allows you to integrate and access your data without sacrificing control.
There are lots of data management pitfalls to watch out for as you migrate to the cloud, but with the right kind of cloud fabric, you can move things along at your own pace based on your available resources, budget, and business priorities.
Ultimately, leveraging the cloud isn’t about one data center hosting strategy vs. another. It’s about modernizing the infrastructure required to run the business, focusing on your core business instead of IT infrastructure and hardware management, and connecting your business with cloud services to differentiate and compete more effectively.
Available in 3 product editions that run on AWS, Azure and vSphere
SoftNAS delivers cloud storage cost savings by leveraging durable, low-cost cloud object storage. It cuts the time and complexities of cloud data migration projects by up to 90%, turning months into weeks and simplifying live data migrations to the cloud. It enables customers to connect any kind of data to the cloud anywhere in the world.
It addresses the impediments that block real-world cloud adoption, revolving around common cloud storage use cases: primary and secondary cloud file storage, workload and application cloud migration, hybrid cloud, and synthetic cloud backups in partnership with Veeam Software, Inc.
SoftNAS accelerates high-speed global data transfers up to twenty times faster with its patented UltraFast technology, enabling customers to connect remote offices, branch offices, factories and IoT at the edge with the cloud.
The company conducted customer interviews and identified 18 discrete cloud adoption barriers that SoftNAS 4 solves.
“Based upon our multi-cloud research we uncovered a number of challenges customers are struggling with to take advantage of public cloud infrastructure. SoftNAS 4 addresses many of the customer data management and control challenges head on with its unique combination of cloud NAS, bulk data transfer acceleration and data integration capabilities. SoftNAS is delivering a robust cloud data fabric companies can use as a strategy to more quickly adopt the cloud and save time and money in the process,” said Jeff Kato, senior analyst, The Taneja Group.
“Today we are delivering on our company mission and the vision to become the fabric for business data in the cloud. After three years of relentless company focus, strategic input from our cloud platform partners and customers, we have produced what I believe is the gold standard in cloud data control and management software,” said Rick Braddy, CEO/CTO and founder, SoftNAS. “When I founded SoftNAS six years ago as a disgruntled traditional storage customer, my goals were to reduce the high costs of storage, make it easier to connect applications with the cloud and keep IT in control of its destiny in the cloud. SoftNAS 4 is the realization of those goals and I invite customers and partners to start benefiting from it today,” Braddy said.
SoftNAS 4 features include:
- UltraFast is the company’s patented technology that saves on time and costs by accelerating global bulk data movement up to twenty times faster than standard TCP/IP protocols at one tenth the cost of alternative bulk data transfer solutions. UltraFast accelerates transfers of data into, out of and across private, public, hybrid clouds and multi-clouds.
- UltraFast overcomes up to 4,000ms of latency and up to 10% packet loss due to network congestion, connecting remote locations with the cloud over any kind of network conditions.
- Lift and Shift with continuous sync enables live migration of production data and workloads and keeps content up to date when moving data to the cloud, between datacenters and/or distributing it to remote locations. Automatic restart and suspend/resume ensures bulk data transfer jobs run reliably, while bandwidth schedules enable customers to open or throttle network usage. Lift and Shift works with UltraFast, so migrations can take place over any kind of network anywhere in the world.
- FlexFiles integrates and transforms 24 types of data by leveraging Apache NiFi and a set of pre-built data integration processors. With FlexFiles, customers can tackle massively complex data integration projects combining file systems, Hadoop, Redshift, HTTP(S), web services, SQL/NoSQL, XML, S3/Blob objects and custom data integrations. FlexFiles connects a customer’s business with cloud and SaaS services through a drag-and-drop interface with no programming required. It also leverages UltraFast, enabling customers to use FlexFiles and Apache NiFi over any network conditions.
- ObjFast makes cloud object storage run at near-block-level performance while still taking advantage of object storage pricing, which results in up to 67% cost savings vs. block storage alone. It throttles data transfer to cloud object storage so it’s as fast as possible without exceeding AWS or Azure object storage read/write capabilities. ObjFast’s patent-pending acceleration technology has been optimized, tested and certified for Veeam Synthetic Full Backups, so Veeam cloud backup, copy jobs and recovery run at near-block-level performance with the public cloud.
- SmartTiers is a patent-pending automated storage tiering feature that moves aging data from more expensive, high-performance block storage to less expensive object storage according to customer-set policies. It reduces public cloud storage costs by up to 67% and is available for beta testing.
SoftNAS 4 is available in three product editions that run on AWS, Microsoft Azure and vSphere. Customers will be able to launch all of the company’s cloud products on demand, directly from the AWS and Azure Marketplaces, to spin up cloud data solutions in just minutes with no prior purchasing approvals.
With up to 67% cost savings, SoftNAS 4 connects any kind of customer data, anywhere in the world, to the cloud.
SoftNAS®, a cloud data platform company with a deep history of transforming costly on-premises NAS storage hardware into dedicated, private NAS virtual software that runs on public cloud services, today announced general availability of SoftNAS version 4.
General availability of SoftNAS version 4 was announced by Houston-based SoftNAS. Available in three product editions that run on Amazon Web Services (AWS), Microsoft Azure and VMware vSphere, the company claims that SoftNAS 4 “accelerates high-speed global data transfers up to 20 times faster with its patented SoftNAS UltraFast technology, enabling customers to connect remote offices, branch offices, factories and Internet of Things (IoT) at the edge with the cloud.”
New features of SoftNAS 4 include SoftNAS UltraFast, SoftNAS Lift and Shift, SoftNAS FlexFiles, SoftNAS ObjFast and SoftNAS SmartTiers.
According to the announcement, the SoftNAS patented UltraFast technology “saves on time and costs by accelerating global bulk data movement up to 20 times faster than standard TCP/IP protocols at one-tenth the cost of alternative bulk data transfer solutions.”
The SoftNAS Lift and Shift feature will provide live migration of production data and workloads and “keeps content up-to-date when moving data to the cloud, between datacenters and/or distributing it to remote locations.”
SoftNAS FlexFiles leverages Apache NiFi and other pre-built data integration processors in order to integrate and transform 24 different types of data. According to the press release, customers will be able to “tackle massively complex data integration projects combining file systems, Hadoop, Redshift, HTTP(S), Web Services, SQL/noSQL, XML, S3/Blob Objects and Custom Data Integrations.”
SoftNAS ObjFast is a patent-pending acceleration technology that has been “optimized, tested and certified for Veeam Synthetic Full Backups, so Veeam cloud backup, copy jobs and recovery runs at near-block-level performance with the public cloud.”
SoftNAS SmartTiers uses customer-set policies to move aging data from more expensive block storage to less expensive object storage, which the company claims can “[reduce] public cloud storage costs by up to 67 percent and is now available for beta testing.”
With SoftNAS 4 released to general availability, customers will be able to “launch all SoftNAS products on-demand, directly from the AWS and Azure Marketplaces to spin up cloud data solutions in just minutes with no prior purchasing approvals,” according to the announcement.
Wendy Hernandez is group managing editor for the 1105 Enterprise Computing Group.
Visit Virtualization & Cloud Review.