C24 customer survey results for 2013–2014 are released

September 15, 2014

Great news from the team at C24. The customer survey results for 2013–2014 are in; an infographic of some of the highlights, plus a company update, is below. If you would like further information or would like to discuss C24 with one of the team, please feel free to call or visit us at http://www.c24.co.uk

 


C24 add further law firms to their client portfolio

September 15, 2014

The team at C24 have also recently won managed services and hosting contracts for two UK law firms. If you would like more information on the solutions we are providing in this space, please visit our stand at:
22nd and 23rd – Thomson Reuters Elite VANTAGE 2014 EMEA Regional Conference, which will be held at the Park Plaza Westminster Bridge Hotel in London. Building upon last year’s success, the organisers aim to give clients a cohesive and consistent conference experience no matter their location, and attendees can expect one of the highest-quality events in the industry.
16th and 17th – The Alternative Legal IT Conference, which focuses only on your needs as a mid-tier firm. This ensures that all the speakers, case studies and solutions are appropriate to your size, your budget and your experiences.


HP CS100 with Citrix XenDesktop running video graphics – task workers, knowledge workers and power users

September 4, 2014

A great solution from HP. C24 are currently evaluating it for a number of clients, allowing us to deliver a hybrid user model for task workers, knowledge workers and power users.


THE BIG DATA EXPLOSION

August 15, 2014

Steve Fingerhut is a VP of Marketing at SanDisk.  In his inaugural guest blog post for the Metadata Era, Steve discusses how enhancing existing server investments with solid-state memory can speed up Big Data analytics while keeping costs in check.

Metadata readers know better than others that we’re living in an era of data: massive data generated by web transactions, our mobile devices, social media and even our refrigerators and cars. The numbers are stunning. Data is growing at dizzying exponential rates: 90% of the world’s data was created over the last two years alone, and by 2020 data will increase by 4,300%.

The majority of data produced today is termed ‘Unstructured Data’, which is data that does not fit well into traditional relational database systems.

This category usually includes emails, Word documents, PDFs, images, and now social media. To give you a glimpse of how much unstructured data we’re generating: every minute, 100 hours of video are uploaded to YouTube, and more than 100 billion Google searches are run every month. But what do we do with all of this data?

Analytic Apps Crave Big Data

The giants of the web have long used data as a tool to help them understand customer behavior. For example, eBay produces 50TB of machine-generated data every day(!), collecting and recording user actions to understand how they interact with its website.

By analyzing data in their focus area, businesses can respond to patterns and make needed changes to improve sales, achieve higher engagement rates, enhance safety or help guide their overall business strategy.

Big Data is no longer just a tool for these web giants. Collecting, analyzing and utilizing data is critical for businesses of any size to remain competitive, and as such, businesses are collecting more data.

When it comes to Big Data, bigger is better. Analytics that are meant to forecast future probabilities (predictive analytics) become far more accurate when increasing data-set size to a massive scale. So companies are expanding projects to help their big data grow even bigger.
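To make that point concrete, here is a minimal sketch (assuming scikit-learn and a purely synthetic dataset, not any particular vendor’s data) that trains the same simple model on progressively larger slices of a training set and reports held-out accuracy; the trend you would typically see is accuracy climbing as the data grows.

```python
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

# Synthetic stand-in for a large behavioural dataset.
X, y = make_classification(n_samples=200_000, n_features=20,
                           n_informative=10, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25,
                                                    random_state=0)

# Train on increasingly large slices of the same training data.
for n in (500, 5_000, 50_000, len(X_train)):
    model = LogisticRegression(max_iter=1000).fit(X_train[:n], y_train[:n])
    acc = accuracy_score(y_test, model.predict(X_test))
    print(f"training rows: {n:>7,}  held-out accuracy: {acc:.3f}")
```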

Research shows that companies making major investments in Big Data projects to mine data for insights are not only generating excess returns but are also gaining competitive advantages. So it’s no wonder data is becoming the most precious commodity for organizations today.

Extending Memory

For data center managers who need to contend with the growth of business data, finding the infrastructure to support storing, archiving, accessing and processing these huge data sets has become one of the biggest challenges facing organizations.

One approach is to divide and conquer the problem by distributing the data across separate servers with idle CPU capacity and storage resources. Having many computing units operating in parallel, as part of a MapReduce platform for example, is one way (though a complex one) to handle the problem.
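As an illustration of that divide-and-conquer idea, here is a minimal, single-machine sketch of the map and reduce phases using only the Python standard library. The input data and chunk count are hypothetical, and a real MapReduce platform such as Hadoop would spread the chunks across many servers rather than across local processes.

```python
from collections import Counter
from multiprocessing import Pool

def map_chunk(lines):
    """Map step: count words within one chunk of the dataset."""
    counts = Counter()
    for line in lines:
        counts.update(line.split())
    return counts

def reduce_counts(partials):
    """Reduce step: merge the per-chunk counts into a single result."""
    total = Counter()
    for partial in partials:
        total.update(partial)
    return total

if __name__ == "__main__":
    # Hypothetical input; on a real cluster each chunk would live on its own server.
    lines = ["big data needs big storage", "data drives decisions"] * 100_000
    chunks = [lines[i::4] for i in range(4)]        # split the work four ways
    with Pool(processes=4) as pool:
        partials = pool.map(map_chunk, chunks)      # the "map" phase runs in parallel
    totals = reduce_counts(partials)                # the "reduce" phase merges results
    print(totals.most_common(3))
```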

Another idea is to squeeze more performance out of existing computing elements. For many kinds of Big Data applications, gigabytes of data points often have to be manipulated at a single time—for example, in complex statistical operations.

It’s far faster (by orders of magnitude) to have the data in memory at the time it’s needed instead of accessing it from disk storage. But this is often feasible only for the most powerful (and expensive) high-end servers with their very large memory spaces.
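A rough way to see the gap is to time the same random record lookups served from a file versus from a buffer already held in memory. The sketch below uses only the Python standard library; the record size and counts are arbitrary, and on a real server with a cold cache and spinning disks the difference is far larger than what a warm OS page cache will show.

```python
import os
import random
import tempfile
import time

RECORD, COUNT, READS = 4096, 50_000, 20_000     # ~200 MB of 4 KB records
payload = os.urandom(RECORD * COUNT)
offsets = [random.randrange(COUNT) * RECORD for _ in range(READS)]

# Write the records to a temporary file to stand in for disk-resident data.
with tempfile.NamedTemporaryFile(delete=False) as tmp:
    tmp.write(payload)
    path = tmp.name

# Random reads from the file (a seek plus a read per record).
start = time.perf_counter()
with open(path, "rb") as f:
    for off in offsets:
        f.seek(off)
        f.read(RECORD)
disk_s = time.perf_counter() - start

# The same random reads served from a buffer already in memory.
start = time.perf_counter()
for off in offsets:
    _ = payload[off:off + RECORD]
mem_s = time.perf_counter() - start

print(f"file: {disk_s:.3f}s  memory: {mem_s:.3f}s  speed-up: {disk_s / mem_s:.0f}x")
os.remove(path)
```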

An effective route to contend with these challenges is to use SSDs, or flash-based disk drives, for this task. SSDs use the same type of memory found in mobile devices and cameras, but it has been expanded and customized to handle far larger capacities and data center levels of reliability. Would you be surprised to learn that the big web giants (like Amazon, Facebook and Dropbox) long ago moved to include flash-based solid-state drives in their storage infrastructure?

As such, SSDs deliver far better performance than legacy storage: up to 100x that of old-fashioned hard drives. Fitted with SSDs, even standard servers can sort and crunch huge amounts of data without the much-feared “disk penalty” of losing valuable time seeking and accessing data blocks from a drive’s magnetic media.

Another benefit: without any mechanical parts, companies can take sudden, unpredictable mechanical disk failure off their list of risks.

A Real Added Value

But there is more to it. You might be wondering about the cost impact of flash, and whether your organization can afford to implement SSDs. I actually think you can’t afford not to, and let me explain why.

As we look at Big Data and analytics, applications are not only coping with huge data sets but also with data from multiple sources, often requiring tens of thousands (if not hundreds of thousands) of input/output operations per second (IOPS) for each workload.

To realize such performance levels with traditional drives, IT managers have had to ‘stitch together’ a huge pile of hard drives to jointly supply the needed IOPS. But bringing together so many drives not only creates complexity and additional points of failure, it also means managing and paying for more racks, more networking, more electricity to power the infrastructure, more cooling and more floor space. As SSDs deliver up to 100x the performance, you will require far less hardware to analyze and perform complex operations, which translates to cost savings on both infrastructure and operations.
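Back-of-the-envelope arithmetic shows why. The figures below are illustrative assumptions only (roughly 200 random IOPS for a 15K RPM hard drive and around 20,000 for a data center SSD), not measured values for any specific product.

```python
import math

# Illustrative assumptions, not measured figures: adjust for your own hardware.
TARGET_IOPS = 100_000          # hypothetical workload requirement
HDD_IOPS = 200                 # rough figure for a 15K RPM hard drive
SSD_IOPS = 20_000              # rough figure for a data center SSD

hdd_needed = math.ceil(TARGET_IOPS / HDD_IOPS)
ssd_needed = math.ceil(TARGET_IOPS / SSD_IOPS)
print(f"Hard drives needed: {hdd_needed}")   # 500 spindles, plus racks, power and cooling
print(f"SSDs needed:        {ssd_needed}")   # 5 drives for the same IOPS target
```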

Let me add some numbers to support my claims. Recently, we conducted a test using Hadoop, a Big Data framework used for large-scale data processing. We compared hard drives against solid-state drives to see not only how much of a performance gain SSDs can deliver, but also to calculate their impact on costs. As you may have already guessed, SSDs came out winning big on both counts. We saw a 32% performance improvement using a 1-terabyte dataset and, better yet, a 22%-53% cost reduction, depending on the workload’s pattern of access to the storage.

Getting the Job Done Right

When aiming to optimize Big Data analytics, there’s still a larger point to be made. It takes two elements to get the job done: hardware advances such as SSDs, combined with smart software. Companies will need both to contend with the oncoming data tsunami and to ensure they can make the most of their analytics to remain competitive.


Big data and the rise of augmented intelligence

August 11, 2014


THREE THINGS TO BE AWARE OF WITH LOW-COST DATA BACKUP SERVICES

August 7, 2014

I’m always a little surprised by the reaction from customers regarding off-site storage services. It goes something like, “Well, the price is so good that I don’t really need to know anything else.” From a pure accounting standpoint, I do see their point.

As a company goes down the road of evaluating low-cost backup and disaster recovery service providers, it should stop and “read the fine manual”, as we say in IT: in this case, the small print contained in the Terms of Service. I’ve looked at more than a few of these agreements, and here are three key points that you should keep in mind:

1. Security Is Ultimately Your Responsibility

You’ll often see language in the ToS saying that “they take security seriously” and that “it’s very important”, but there’s additional legalese stating that the providers can’t be held liable for any damages as a result of data loss.

In fact, some of the ToS have a clause that explicitly says you are responsible for the security of your account. Yes, they will encrypt the data, and you may be given the option to hold the security keys. In a very strong sense, the security hot potato remains with you even though they have the data. When calculating the true costs and risks of these services, keep that in mind.
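One practical way to keep hold of the hot potato is to encrypt before anything leaves your network, so the provider only ever stores ciphertext. A minimal sketch, assuming the third-party Python ‘cryptography’ package; the payload is hypothetical, and the actual upload step is whatever your backup client does.

```python
from cryptography.fernet import Fernet

key = Fernet.generate_key()              # keep this offline, in your own key store
cipher = Fernet(key)

# Hypothetical payload standing in for a file you'd normally hand to the backup agent.
plaintext = b"Q3 payroll export, client contracts, the usual crown jewels"

ciphertext = cipher.encrypt(plaintext)   # this is all the provider ever sees
restored = cipher.decrypt(ciphertext)    # restoring needs the key you kept, not theirs

assert restored == plaintext
print(len(ciphertext), "bytes of ciphertext ready to upload")
```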

2.  Two-Factor Authentication?

As Metadata Era readers, you’re no doubt wondering about two-factor authentication. As a kind of virtual commercial landlord, these services hold data for lots of businesses, so you might expect building security to be tight (“show me your badge”). After all, these backup services are a magnet for hackers.

I didn’t see two-factor authentication listed as a standard part of the packages of the cloud providers I looked at. There are third-party services that can provide out-of-band authentication through a separate logon solution, but at an extra cost, and you’ll have to contract separately with them.
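If you end up adding a second factor in front of a backup portal yourself, time-based one-time passwords (TOTP) are one common approach. A minimal sketch, assuming the third-party ‘pyotp’ package; the account name and issuer are hypothetical, and wiring this into a real logon flow is left to you.

```python
import pyotp

secret = pyotp.random_base32()           # enrol once; store server-side, per user
totp = pyotp.TOTP(secret)

# The URI below is what you'd show as a QR code for an authenticator app to scan.
print(totp.provisioning_uri(name="admin@example.com", issuer_name="BackupPortal"))

code = totp.now()                        # in real life this comes from the user's phone
print("Second factor OK" if totp.verify(code) else "Rejected")
```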

3.  Data Availability

You store your data in the cloud with these companies, so you’d expect some promise that the data will be there when you need it.  Of course, on the public Intertoobz, there are limits to what they can be responsible for. Typically there are clauses in the ToS that exclude the digital equivalent of acts of nature—e.g., DoS attacks.

Outside unusual events, these backup services generally don’t even commit to an availability target: 99%, 99.9%, or pick your sigma. And the most they’re liable for when there’s a loss of data dialtone is the subscription fee.

This is not to say that you can’t get a better deal, with Service Level Agreements (SLAs) that compensate you when certain metrics aren’t met, but for low-price, one-size-fits-all bit lockers there is usually little or no opportunity to negotiate.
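It’s worth translating those percentages into plain downtime before you sign. A quick sketch of the arithmetic, using average month and year lengths:

```python
# Convert an uptime percentage into the downtime it permits per month and per year.
HOURS_PER_MONTH = 730      # average month
HOURS_PER_YEAR = 8766      # average year, including leap years

for availability in (0.99, 0.999, 0.9999):
    down_month = (1 - availability) * HOURS_PER_MONTH
    down_year = (1 - availability) * HOURS_PER_YEAR
    print(f"{availability:.2%} uptime -> "
          f"{down_month * 60:.0f} min/month, {down_year:.1f} h/year of allowed downtime")
```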

 

If you already have an outsourced data backup or disaster recovery solution in place with a sensible SLA, you can truly estimate the cost savings, and you’re getting a blue-light deal, then more power to you.

However, for everyone else, a good in-house IT department using purchased archiving or transfer solutions can offer custom security, high availability and guaranteed accountability.

