Monday, June 24, 2013

Cenzic launches managed services offering for Enterprise application security

Application security intelligence provider Cenzic has expanded its Managed Services for Enterprise Application Security to offer four levels of service, including a compliance-readiness assessment available to all types of organizations.

Cenzic CMO Bala Venkat: "The lines between web, mobile and cloud are blurring."
Application security intelligence provider Cenzic has expanded its Managed Services for Enterprise Application Security to offer four levels of service, including a compliance-readiness assessment available to all types of organizations. What's included in the new offering? Here are the details.
The offering enables remote vulnerability testing of cloud, mobile and web applications. Once risks have been identified, recommendations are provided to assist with data protection. The offering is powered by the company's HailStorm technology.
According to the company, the new offer includes the following features:
- Secure code analysis – monitors and analyzes software code during development, finding errors and potential vulnerabilities without executing the code.
- Development and pre-production scanning – tests applications in development and staging environments as part of the software development lifecycle (SDLC).
- Manual penetration testing – Cenzic's professional security team conducts extensive testing customized to specific applications.
- Secure production application testing – tests all web applications, web services and legacy applications safely, without impact on the production environment.
- Real-time production application monitoring and web application firewall (WAF) integration – provides real-time monitoring of web applications in production, feeding results to the WAF in real time and automating security policies to protect applications from detected vulnerabilities.
- Mobile application testing – tests for vulnerabilities in applications with mobile connections.
Cenzic CMO Bala Venkat said in a prepared statement that "the lines between web, mobile and cloud are blurring, resulting in new, sophisticated and continually evolving vectors for online threats."
"This managed offering helps lower capital expenditures and operating costs, giving companies confidence that they are up to date with the latest security threats to their business," he added.
For more information on Cenzic's partner program, please visit the company's Web site.

Autotask Community Live: Complete MSP News Recap

Complete news coverage of Autotask Community Live 2013 for managed service providers (MSPs).

Autotask CEO Mark Cattini's core message was about catching growth waves such as cloud, mobile, social and data.
As Autotask Community Live 2013 wraps up, there was plenty of business and cloud management news for managed service providers (MSPs). Whether you attended the event in person as an MSP or customer, or followed it remotely, here's a complete list of coverage, related links and analysis from MSPmentor.
Our cover features:
1. Autotask Community Live: Seven Voices In the Hall – including a rather interesting one about Microsoft's Office 365 Open cloud partner program.
2. Microsoft, Continuum Cloud Updates, Managed Services Strategies – including more Office 365 insights plus managed services guidance from Continuum VP Steve Ricketts.
3. Autotask CEO: Potential Cloud Services IPO? – CEO Mark Cattini describes why the company has met with financial experts and bankers. No IPO is expected at this point. But Autotask now has much more knowledge about its valuation, I suspect that ...
4. Autotask CEO: Data Is Big, the Platform Is Smart – take a closer look at this blog entry, because it includes extensive views from many of Autotask's top executives. This is the post to read for all of Autotask's key MSP conference messaging.
5. Autotask Community Live: 20 Questions – this was our conference preview, raising the key questions MSPs should explore.
That's all for now.

Apple iOS 7: Can You Manage New iPhone, iPad OS?

When Apple's iOS 7 arrives for iPad and iPhone in the fall of 2013, MSPs and mobile device management (MDM) software companies better be ready.

Apple says iOS 7 for iPhone and iPad will arrive in the fall of 2013. Will MSPs be ready?
Apple (AAPL) today announced iOS 7, an update to the mobile OS for iPhone and iPad. iOS 7 is now in beta and will be available for Apple's smartphones and tablets in the fall of 2013. For MSPs, RMM software vendors and mobile device management (MDM) companies, the race is on to ensure iOS 7 is easily managed by remote systems.
Of course, a long list of RMM (remote monitoring and management) software companies already support Google Android and iOS 6. We will be checking in with Continuum, GFI Software, LabTech Software, Kaseya, Level Platforms, N-able (now owned by SolarWinds) and others in the industry to see how quickly they support the new iOS 7 release.
What can consumers and business users expect from iOS 7? According to Apple:
"iOS 7 is completely redesigned with subtle motion, an elegant color palette and distinct functional layers that make it feel more alive. The typography has been refined for a cleaner, simpler look, and the use of translucency and motion makes even simple tasks more engaging. iOS 7 has hundreds of great new features, including Control Center, Notification Center, improved Multitasking, AirDrop, enhanced Photos, Safari and Siri, and introduces iTunes Radio, a free Internet radio service based on the music you listen to on iTunes."
A deeper look at iOS 7's feature list includes:
- Control Center. A single swipe gives users access to a centralized control center for airplane mode, Wi-Fi, Bluetooth, Do Not Disturb and more, Apple said.
- Notification Center is now available from the lock screen, so users can see all their notifications with a single swipe, Apple said.
- Improved Multitasking allows developers to enable multitasking in any app in the background with a new API. In addition, Apple says, users can switch between their applications in a more intuitive and visual way.
- AirDrop is a completely new way to share content with people nearby, Apple reported.
- iCloud Keychain can store passwords and credit card information across all your devices, Apple said.
Apple says the iOS 7 beta software and SDK are available immediately for iOS Developer Program members at developer.apple.com. iOS 7 will be available as a free software update for iPhone 4 and later, iPad 2 and later, iPad mini and iPod touch (fifth generation) this fall, the company indicated.

Sunday, June 23, 2013

Business Leadership Lessons from Game of Thrones and the House of Stark

Celestix CEO Tim Ager recently took a look at what business leaders can learn from Game of Thrones, and more precisely from the successes and failures of the hero Ned Stark. Here's what you need to know so you don't lose your head.

Despite his previous successes and his high level of integrity, Ned Stark was unable to adapt to his new role.
Robb Stark is dead, the Stark family will never be reunited at Winterfell or anywhere else, and the hopes of fallen heroes lie in ruins. That is basically how season 3 of Game of Thrones closes. What does this have to do with managed services and business? Celestix CEO Tim Ager recently pointed out on his blog that the first season of the HBO series can provide useful lessons in leadership and management, especially if you do not want to "lose your head" the way the hero Ned Stark did. Here's Ager's advice on how to take your career to the next level.
Ager said that Ned Stark begins the series as a successful leader, but is put to the test when he reluctantly accepts the position of "Hand of the King" – essentially the COO who must deal with the day-to-day rule of Westeros.
As Ager points out, Stark has an impressive résumé leading up to the job:
- Born into a noble family and well educated
- Trained in leadership and warfare
- Successful in numerous campaigns over many years
- Obtained many achievements and accolades related to ruling his northern territory
- Trusted friend, right-hand man and supporter of the King
As Ager notes: "You've done well in your core competency, earned your spurs and progressed in your career. A look at your résumé shows exemplary credentials. And then you are asked to step up and become a VP or join the C-suite."
Ager argues that the skills that earned you your new position are not the same ones that will allow you to succeed in it.
His new role as Hand puts Stark in a new situation, among others who hold power but have different agendas. Stark has no experience in this arena, having spent his career commanding subordinates and never cultivating the ability to deal with the ambiguity of politics.
And while any argument criticizing Ned Stark – the hero of the first season, who retained his integrity and morality despite horrendous circumstances – will probably be met by GOT fans and Stark loyalists as heresy, Ager is right. Ned does not adapt to his new situation, and all the characters are still feeling the repercussions at the end of season 3. But does that mean you should change who you are to fit your new role?
"Now I'm not for a minute suggesting, based on poor old Ned's experience, that you change everything and transform yourself into something you're not. After all, your strengths are the main reason you were asked to step up and take on a larger role," Ager wrote. "However, we should all listen to the allegory of Ned Stark and realize that unless we adapt to our new leadership roles, we stand a greater chance of failure. Successful leadership is about many things, and it means different things in different organizations, but one foundation of successful leadership is to continue to be good at what got us here while adapting enough to succeed at the new things we must do."
Expect more GOT management and leadership lessons from Ager as season 3 comes to an end. A forensic analysis of Robb Stark's errors? Daenerys' triumphs? Jon Snow's career path?
Meanwhile, have you gleaned any management wisdom from books, movies or television? Tell us in the comments.

IBM SmartCloud, PureFlex: MSP double-crossed?

IBM SmartCloud and PureFlex Systems will move into the spotlight at the IBM MSP Summit.

IBM GMs Andy Monshaw (left) and Deepak Advani (right) will likely offer PureFlex and SmartCloud updates during the MSP Summit.
IBM SmartCloud and PureFlex Systems will take center stage at IBM's MSP Summit this week in Las Vegas. The big question: can IBM convince more MSPs to standardize on its hardware infrastructure and converged cloud platforms (especially in the midst of acquiring SoftLayer)?
Among the leaders addressing these and other questions this week at IBM's summit:
- Andy Monshaw, general manager, PureFlex Systems
- Deepak Advani, general manager, Cloud and Smarter Infrastructure
IBM's effort to engage MSPs began in 2011. The original effort centered on data center hardware sales. But IBM has been expanding its focus to include cloud and managed services to which MSPs can connect.
Not coincidentally, IBM's MSP definition has expanded to include a wide range of companies offering on- and off-premise IT services. It is clear that IBM is trying to engage everyone from telecom providers to systems integrators with its portfolio of hardware, software and managed cloud offerings.
Among the server hardware giants (Dell, HP, IBM and Oracle's Sun), I think IBM has the sharpest focus on MSPs. But are those MSPs generating big business for IBM? It's a safe bet Monshaw and Advani will provide updates when they take the stage at IBM's June 11 summit.

Managed services: 7 Blogs MSPmentor didn't write, 7 June

This week's managed services provider (MSP) news, gossip and rumors involve Autotask, LabTech, Microsoft Surface, Continuum, tablets and more.
Our team spent the week at Autotask Community Live, LabTech's Automation Nation and the Cisco Partner Summit. Next week, I'll be at the IBM Edge/MSP Summit, while keeping an eye on the upcoming Level Platforms MSP Community Summit and TruMethods Schnizzfest. Meanwhile, here are seven managed services provider (MSP) reports and blog entries that the MSPmentor team did not have the opportunity to write for the week ending June 7, 2013.
7. Bigger Picture: twice in recent days I have had extended conversations with LabTech Software CEO Matthew Nachtrab. We covered a wide range of things: LabTech's performance over the past year, the company's evolution, the commitment to integration with Autotask and Tigerpaw Software (you read that right) and much more. Stay tuned for a more complete review.
6. Surface sightings: I've seen some Surface users in the LabTech conference area. Neri was among them. I remain a long-run believer in Surface. But please, Microsoft, can we have a channel partner program?
5. Career moves: at least three high-profile, senior MSP industry executives are close to changing jobs. Big names. Big moves. Details as soon as possible.
4. Boundary momentum: has CEO Gary Read, formerly of Nimsoft, caught lightning in a bottle again? I hope to share information and insights from a recent conversation soon.
3. Birthday: have a great one, Charlene.
2. Who's next?: so, the U.S. government is somehow monitoring Google, Microsoft and others. Sorta makes you wonder: can Big Brother find the time and resources to start infiltrating small MSP data centers?
1. Did you notice this?: Autotask is holding MSP training sessions in Microsoft offices across the United States. At the same time, Autotask is offering tighter integration with Microsoft Office 365. Hmmm ... just how close will the two companies become?
That's all for now. Thank you for your continued readership.

Tuesday, June 18, 2013

In-memory databases are the answer, or part of the answer?

Before discussing a topic, it is always good to begin with a definition, and what better way than to consult the omniscient Wikipedia (as we have done before). Wikipedia has this to say about in-memory databases:

Source: Wikipedia, The Free Encyclopedia

Or, to paraphrase simply: in-memory databases are more efficient than traditional database management systems because they do not face the same I/O constraints of reading from and writing to disk.

Much like the ubiquitous "big data" hype, in-memory is in vogue for 2013. However, let's not forget that in-memory databases have been around for 25 years or more, and it was largely an economic constraint (the price of RAM) rather than a technological one that prevented their use from becoming more prevalent. The pace of technological progress has seen the price of RAM drop greatly, with numerous sources identifying that the price of memory has fallen around 33% per annum for the last two decades, and the same sources expect a similar rate of reduction in the years to come.

It's hard to deny that business demand for analytics performance is always there. After all, the sooner the business has answers, the sooner it can make decisions, which generally leads to greater business value.

However, we still need to return to the economic argument for storing data in memory. Must all your data be in memory? More to the point, can you afford for all your data to be in memory?

Yes, the price of memory has dropped significantly and continues to do so, but memory remains roughly 80 times more expensive than disk storage. Add to the equation the rate at which data volumes are growing, and the need to capture, store and analyze those growing volumes, and you are left with two opposing trends that may never, or at least not in the foreseeable future, reach an acceptable price equilibrium.
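To make the economics concrete, here is a minimal sketch of the cost comparison. The dollar prices below are hypothetical illustrations, not quoted market rates; only the roughly 80x RAM/disk price ratio and the ~33% annual memory price decline come from the discussion above.

```python
# Illustrative cost comparison for in-memory vs. disk storage.
# DISK_COST_PER_GB is an assumed figure for illustration only;
# the 80x multiplier is the ratio cited in the article.

DISK_COST_PER_GB = 0.05                    # assumed $/GB for disk
RAM_COST_PER_GB = DISK_COST_PER_GB * 80    # the article's ~80x multiplier

def storage_cost(tb, cost_per_gb):
    """Return the storage cost in dollars for `tb` terabytes."""
    return tb * 1024 * cost_per_gb

def years_until_parity(ratio=80.0, annual_drop=0.33):
    """Years for RAM to reach disk's *current* price if RAM falls ~33%/yr.
    A simplification: real disk prices fall too, so parity takes longer."""
    years = 0
    while ratio > 1.0:
        ratio *= (1 - annual_drop)
        years += 1
    return years

if __name__ == "__main__":
    for tb in (1, 10, 100):
        disk = storage_cost(tb, DISK_COST_PER_GB)
        ram = storage_cost(tb, RAM_COST_PER_GB)
        print(f"{tb:>4} TB: disk ${disk:>10,.0f}  vs  RAM ${ram:>12,.0f}")
    print("Years to parity (vs. today's disk price):", years_until_parity())
```

Even under these toy numbers, the gap per terabyte stays large, and because disk prices fall too, the two curves need not converge soon, which is the point of the paragraph above.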

So logically, and (coming from an accounting background) economically, there must be multiple storage options for data within a single analytics ecosystem, in which in-memory will play a significant role. Of course, for data that is used heavily on a day-to-day basis there will be a more convincing business case to store that data in memory for the performance benefits the business requires. But for infrequently accessed historical data, do the numbers really stack up for keeping it in memory?

For information about how Teradata Intelligent Memory addresses the usage-versus-storage equation, click here.

David Hudson is a Senior Consultant with Teradata ANZ Solutions. He has 10 years of experience in data warehousing, primarily focused on enterprise data model solutions, including data integration, ETL design and logical data modeling.


For retailers this summer, the bottom line is inventory

Here's some encouraging news for retailers:

According to the National Retail Federation, April retail sales (excluding restaurants, gas stations and cars) increased a seasonally adjusted 0.6 percent over the previous month, and rose 3.9 percent year-over-year. NRF attributes the gains to last month's improved employment data, housing prices and a record stock market.

Of course, whenever such results make headlines, experienced retailers know to respond with only cautious optimism. They always counter positive sales reports with two questions:

1. Will the increases continue? And,

2. If they do, will we be ready?

No one can answer the first question with absolute certainty. You'd need a crystal ball.

Fortunately, however, more and more retailers are finding that an answer to the second question is within reach, and they are turning their attention to it with confidence.

You see, it is only natural for retailers to be obsessed with stock levels. The flow of product inventory is the lifeblood of any retail business. It's what keeps them up at night. It's the bottom line of success.

But now, retailers can take the guesswork out of inventory. They can use analytics to better forecast product needs, optimize inventory flow and replenish according to customer shopping patterns. In fact, we just announced an advanced solution that makes these processes not only possible, but easier than ever.
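The idea of forecasting product needs and replenishing from shopping patterns can be sketched in just a few lines. This is a generic textbook-style illustration, not the announced solution; the sales figures, lead time and safety-stock factor below are all hypothetical.

```python
# A minimal sketch of analytics-driven replenishment: forecast demand with a
# moving average, then compute a reorder point. All inputs are invented
# illustrations, not data from any retailer or product.
from statistics import mean, stdev

def forecast_daily_demand(sales_history, window=7):
    """Forecast tomorrow's demand as the mean of the last `window` days."""
    return mean(sales_history[-window:])

def reorder_point(sales_history, lead_time_days=3, safety_factor=1.65):
    """Units on hand at which to reorder: expected demand over the supplier
    lead time, plus safety stock (safety_factor ~ a 95% service level)."""
    daily = forecast_daily_demand(sales_history)
    safety = safety_factor * stdev(sales_history) * lead_time_days ** 0.5
    return daily * lead_time_days + safety

if __name__ == "__main__":
    sales = [12, 15, 11, 14, 13, 18, 16, 14, 17, 15]  # units sold per day
    print("Forecast:", round(forecast_daily_demand(sales), 1))
    print("Reorder when stock falls below:", round(reorder_point(sales)))
```

Real merchandising systems layer seasonality, promotions and store-level variation on top of this, but the core loop (forecast, buffer, replenish) is the same.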

And the timing for this technology could not be better. In today's challenging economy, retailers have little margin for error. In addition, they are making it a priority to improve the customer experience by being more in tune with customer preferences.

According to the fifth annual Retail Merchandising Benchmark Report from Retail Systems Research (RSR), which provides insight into the business and technology challenges facing the extended retail sector: "While out-of-stocks and inventory performance remain top-of-mind (especially for large retailers and those who sell goods), retailers' concerns about understanding customer preferences – and their ability to meet those preferences with new ideas on pricing and promotions – have become their top business challenges."

What can retailers do to become more responsive to consumer demand? RSR suggests these four actions:

Get predictive. Use modern forecasting methods to discover insights into customer purchase behaviors and preferences.

Get reactive. Feedback is essential. When you have access to real-time information, you can automatically respond to trends.

Get automated. You need the right tool for the right job. Today's inventory and warehouse environments require large-scale data analysis and automated technology.

Get scientific. Intuition and gut feelings only get you so far. Now retailers need to make data-driven decisions in order to remain competitive in the global marketplace.

I would suggest another:

Get moving. It's time to update your strategies. Don't worry: you don't have to boil the ocean. But you do need to get started.

Will consumer confidence continue to rise in the coming months? NRF forecasts moderate retail sales increases throughout the summer. Either way, you'll want to be ready.

-Darryl


More is better, with the right tools

More data is better, right? Yes, provided organizations have the expertise and platforms to capture, analyze and standardize that data. But many companies are finding that their diverse data types and analysis needs exceed their existing capacity. The answer is the Teradata® Unified Data Architecture™, which enables transparent movement of data into and out of complementary systems. According to industry experts, this unified data and analytics environment enables organizations to leverage all of their data for new insights and new business opportunities.

"A unified data environment recognises that there are many analysis tools and numerous paths to get what you need, and you have to have the right platform for the right workload," says Tony Baer, principal analyst at Ovum.

The unified data architecture is purpose-built to address all forms of data, says Hortonworks Vice President Shaun Connolly in his discussion of how an integrated environment derives value from data in new ways. "The unified data architecture is purpose-built to address all forms of data. That really resonates, especially with large companies," says Connolly.

Jennifer Niemela
Executive Editor
Teradata Magazine

Teradata, Unified Data Architecture, data architecture, UDA, Shaun Connolly, Hortonworks, analytics, data analysis, unified data environment


The Battle of Retail, Bricks vs. Clicks: Who is leading and ... who will win?

I live in an airplane. Not literally, of course; my actual residence is in Amsterdam (the original one, not the one in New York). The reason for all that flying is to meet with retailers around the world. (Why? Professionally, Teradata is a partner and service provider for global retailers, and I consult with companies that are forging ahead in the ongoing multi-channel retail battle.) The point of this blog is to share some of those retail discussions specific to the bricks-versus-clicks battle and to set the stage for the next blog.


My idea is to address some battle strategies and tactics in depth, to air them and solicit feedback from thoughtful readers, as retail brands make decisions and lay the foundation for the future.

What do I mean by "retail battlefield"? I am referring to the obvious, long-running rivalry between pure-play e-tailers and store-based retailers. It began at some point as online businesses (the likes of Amazon, eBay and others) drove product prices and availability through the floor and made it easy to buy online, giving traditional stores new competition that was faceless and virtual. The resulting conflict has been everything from chaotic to catastrophic, but it has certainly changed the face of retail. We have battle fodder across the spectrum: specialist retailers, luxury retail, big-box stores and even food [for a specific example of the bricks vs. clicks battle, check out this story].

Adaptive retailers are not rolling over and waiting for inevitable decay. Although many have experienced declining sales, higher returns and showrooming, forward-looking organizations are standing up to their slicker click brethren by adding more product detail, in-store service and flavor, and designing a new, seamless retail channel concept (regardless of the nomenclature: cross-channel, multi-channel, omni-channel, the anything, anywhere, any-time shopping experience). Late-adapting retailers, however, suffer from price wars, internal conflicts and an uncertain future.

My discussions with traditional brands are wide and varied. They include companies trying to make sense of the strategy needed to serve technology-empowered consumers. This strategy discussion often leads to internal dissension and inconsistency between marketing executives and technology professionals. In addition, revamping a "traditional" image to be super-charged with entertainment (gamification) and convenience (mobile) to reach younger consumers has also spurred lively discussion. The trending topic for store brands is the ability to drive traffic and provide a unique customer experience as the "face" of the brand.

After all my meetings with global brands, I get a bird's-eye view of the bricks-versus-clicks retail battle. Resonant themes of interest (like these and more) will be the focus of upcoming discussions:

- Paper, paper and more paper ... how do we move from promotions to interactions?
- Personalization: what does it mean, where do you start, and how do you execute?
- Who is the customer, across all points of interaction, and are we interacting appropriately at all times, in every dimension?
- Millennial clients: how to engage and enthrall them
- Ingratiating our brand into consumers' lives
- Coordinating the daily rhythm of a marketing team driving at a reckless pace while keeping the peace: the convergence of interactive messaging, the CMO and the CIO
- Real time: there is no silver bullet – frequency, timeliness, content

We will dive into these as the battle wages on in a truly Darwinian way, a seemingly endless struggle. Analysts and businessmen alike are betting on the outcome of the "battle", but for strategists the question is not just about "produce", as this article may ironically suggest. After all, Amazon no longer sells only books.

Stay tuned ...


The Analytics Lifecycle Allows an Iterative Approach to Customer Analytics


In a recent webinar, Christine Richards, director of knowledge services at the Utility Analytics Institute, shared three recommendations for utilities looking to implement customer analytics.

- Meter data is just one source. While Richards agreed that meter data is important, she also urged utilities to consider other data sources and systems from across the enterprise that support customer operations analytics.
- Automate for efficiency. Because of the volume of data, it is critical to determine how to filter which information needs human intervention and which analyses can be automated to reduce the time commitment, freeing employees to focus on more complex customer issues.
- Create a cross-functional team. Putting the right team in place now and assigning dedicated resources will serve utilities well as customer operations analytics efforts expand.

Finally, Richards urged utilities to view analytics as a tool for achieving a strategic objective, not the objective in itself. We couldn't agree more. The value in analytics is the ability to gain insights that allow utilities to improve reliability, services and customer relationships across the energy value chain.

Brian Jore, director of Utility Business Consulting at Teradata, followed Richards' presentation to discuss an iterative approach to customer operations analytics that makes it much easier for utilities to get started. Jore begins by defining integrated data analytics as "joining and correlating disparate data to discover new business insights and optimize processes."

The reason utilities need integrated data to perform analysis is to overcome the need for data acrobatics. Jore cited examples of manual processes, spreadsheets, skunk works and disparate applications that make it difficult to pull all the data together and use it to move the needle on business operations.

To combat this status quo, Jore introduced the analytics lifecycle.

Applying this construct to customer analytics, a utility starts with smart meter data and surrounds it with other customer data to build the integrated data repository on which segmentation can be applied. This segmentation approach quickly enables targeting of the customers who fit the profile for the services being created. It also demonstrates the value of integrated data, helping win buy-in from utility program leaders to expand the scope of your analytics program.
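Segmentation over an integrated customer repository can be pictured as simple rules applied to combined meter and billing records. This is a rough sketch; the field names, thresholds and sample records are hypothetical illustrations, not anything from the webinar.

```python
# Rough sketch of rule-based segmentation over integrated customer data.
# All field names and thresholds are invented for illustration only.

def segment(customer):
    """Assign a customer record (smart meter + billing data) to a segment."""
    if customer["avg_daily_kwh"] > 40 and customer["has_ev_charger"]:
        return "ev-tariff-candidate"
    if customer["peak_usage_ratio"] > 0.5:
        return "demand-response-candidate"
    if customer["late_payments"] >= 3:
        return "credit-watch"
    return "standard"

customers = [
    {"id": 1, "avg_daily_kwh": 55, "has_ev_charger": True,
     "peak_usage_ratio": 0.3, "late_payments": 0},
    {"id": 2, "avg_daily_kwh": 20, "has_ev_charger": False,
     "peak_usage_ratio": 0.6, "late_payments": 1},
    {"id": 3, "avg_daily_kwh": 18, "has_ev_charger": False,
     "peak_usage_ratio": 0.2, "late_payments": 4},
]

# Target only the customers who fit the profile for a given program.
targets = [c["id"] for c in customers if segment(c) == "demand-response-candidate"]
print(targets)  # [2]
```

The point is that the rules run against one integrated repository, so a "demand-response candidate" means the same thing no matter which program or channel asks for the list.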

This approach shows how data interactions can work. For example, integrated data makes it easy to tie in a multichannel marketing approach and streamline processes, because it reuses the same consistent business rules in the data warehouse to identify these customers, enabling the development of personalized marketing messages for use in the call center.

The analytics lifecycle gives utilities a great return on their investment by creating a framework that serves as the foundation for a business process: creating new insights that allow business users to step in when a process is not working as expected. This is very different, and more insightful, than a set of dashboards and end-of-month reports.

The three-step wheel is intended to cycle continuously, like clockwork. Each step provides value, but at the center is the concept of integrated data, which continually evolves as the cycle repeats over time. The analytics framework effectively removes the manual aspect of collecting and assembling data for each analysis.

Step 1: Analyze & Explore:
This step is not about generating a report. Many utility programs do not know what the output requirement is when they begin. In other words, they'll know it when they see it. Utilities need an environment that lets them access the data, apply different hypotheses and perform dynamic segmentation leading to discovery.

Jore advocates leveraging every touch point with a customer to understand where the opportunities lie. This process helps surface insights for regular business users, but also for more advanced analysts, in the form of correlations and path analysis: the key steps that lead to a given customer behavior. Once discovered, the utility can actively monitor that behavior and, with each iteration, better predict what might cause it.
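Path analysis in this sense means counting which interaction sequences most often immediately precede a behavior. A minimal sketch follows; the event names and sample customer journeys are invented for illustration.

```python
# Minimal path analysis: count the fixed-length interaction sequences that
# immediately precede a target outcome across customer journeys.
from collections import Counter

def top_paths(journeys, outcome, length=2):
    """Count the `length`-step event sequences found right before `outcome`."""
    counts = Counter()
    for events in journeys:
        for i, e in enumerate(events):
            if e == outcome and i >= length:
                counts[tuple(events[i - length:i])] += 1
    return counts.most_common()

# Hypothetical touch-point journeys, one list per customer.
journeys = [
    ["web_visit", "high_bill_alert", "call_center", "complaint"],
    ["high_bill_alert", "call_center", "complaint"],
    ["web_visit", "email_open", "portal_login"],
]

print(top_paths(journeys, "complaint"))
# [(('high_bill_alert', 'call_center'), 2)]
```

Here the analysis surfaces that a high-bill alert followed by a call-center contact tends to precede a complaint, exactly the kind of precursor pattern a utility would then monitor.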

Step 2: Align & Optimize:
Take the insights and customer segments identified in the first step and work toward discovering the ideal combination of products and services for each segment. The utility can also learn to what extent those profiles have been penetrated and can be expanded.

In addition, utility marketers can begin to determine marketing channel effectiveness by identifying how customers respond to offers and communications placed in different channels, such as the web, call center, email and customer portal.

With these insights in hand, channels can now be optimized to take advantage of opportunities. Examples include the ability to increase adoption of demand response programs for regulated services, or to scale lead generation for retail energy providers.

Step 3: Production & Tracking:
Through the work done in the first two steps, utilities will develop a set of business rules. These serve as parameters that can be used consistently across channels to produce the output you're looking for. With this process automated, utilities no longer have to step in and manually kick off reports. Instead, they simply run.
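One way to picture "business rules used consistently across channels" is a single shared rule set that every channel evaluates the same way. The rule names and logic below are hypothetical, for illustration only.

```python
# Hypothetical shared rule set: each rule is a name plus a predicate.
# Every channel (email, call center, portal) evaluates the same rules,
# so the definition of a qualifying customer never drifts between channels.
RULES = {
    "late_payer": lambda c: c["late_payments"] >= 3,
    "high_usage": lambda c: c["avg_daily_kwh"] > 40,
}

def matching_rules(customer):
    """Return the names of all business rules this customer satisfies."""
    return [name for name, pred in RULES.items() if pred(customer)]

customer = {"late_payments": 4, "avg_daily_kwh": 25}
print(matching_rules(customer))  # ['late_payer']
```

Because the rules live in one place, tightening a threshold updates every channel at once, which is what lets the reporting and follow-up actions run without manual intervention.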

With automated reporting, business users can begin to take action, understand trends, learn about new opportunities and identify areas where customer behavior is not moving in the direction the utility wants (credit and collections issues, for example).

Once specific behaviors have been identified, refinements can always be made. But you will also want to track these parameters to understand the various transitions. This leads back to Step 1: Analyze & Explore, continuing the evolution of the insights you are gaining from integrated data analysis.

Essentially, the analytics lifecycle allows utilities to act on data, rather than spending all their time recreating data and analyses.

Per ulteriori informazioni, Guarda il webcast on-demand.


Monday, June 17, 2013

Why You Need To Start Focusing On Social Intelligence


How's your company's social intelligence?

Ask a roomful of business leaders that question, and you're likely to get a variety of answers, ranging from "Great!" and "Getting better every day!" all the way to "Remind me, what's social intelligence?"

If you're on the latter end of that spectrum and still feel uncertain about how social networks like Twitter, Facebook, YouTube, SlideShare and/or LinkedIn can affect your business, I have to be honest: It's time to up your game. The truth is, social networks like these and many others are affecting your business – and if you're not paying attention, you're not only losing valuable ground to your competitors, you're losing invaluable relationships with your customers.

Don't get me wrong. I'm not here to "hype" engagement across social media platforms. And the last thing I want to do is add to the list of problems keeping you up at night.

But, if you're a business leader interested in revenue growth – and who isn't? – you need to start focusing on social intelligence. Every day now, billions of messages are shared across social platforms. How many of them mention your company, product, service, industry, your employees ... or even you? Are those comments glowing recommendations, nasty complaints or something in-between? How many are threats? How many are opportunities? Don't you want to know?  How well are you "listening" to your marketplace?

Companies need to become more attentive and improve their understanding of how social media impacts revenue – both positively and negatively. After all, social media conversations are now shaping the marketplace more – and increasingly faster – than most companies can keep up with.

For example, new social interaction metrics are changing the way value is assigned to television audiences. In other words, the networks aren't necessarily gauging a program's success solely on the basis of audience size. They're also considering the social media activity and habits of certain audiences. As Kevin Glacken explained last week at Smart Data Collective:

"It's understandable that the traditional audience-size approach has survived for the past 50 years or so, given the one-way channel television has been over that period. However, today, with the countless ways for viewers to immediately react to and interact with their television programs, quality of viewers in terms of interaction and insight is matching sheer audience volume in many cases."

"Programs that have high IVRs (Interactive Viewer Ratings, a metric developed by ListenLogic) can gain a deep, multidimensional understanding of their audience members via advanced social intelligence in terms of interests, activities, likes, dislikes, attitudes and behaviors. This ultimately can deliver better targeted and effective messaging to advertisers. This increases the efficacy, and thus the value, of the program."

At Teradata, we know it's increasingly essential for all companies – whether they're B2C or B2B – to tap into the insights available from social media big data streams. Even better, we know that true business value emerges when social intelligence is integrated with other internal practices. Put another way:

Big data scope + fresh data delivery + behavioral and contextual data + comprehensive predictive analytics + real-time messaging = Maximized business results.

With its thorough and detailed data-driven understanding of customer behaviors, Teradata's Interactive Customer Engagement now empowers marketers to:

Combine online and offline data to reveal new insights; deliver optimized, personalized real-time offers based on historical and contextual in-session data and search results; coordinate offers across online and offline channels in real time; leverage industry-leading campaign management and email delivery functionality to identify and communicate more relevant and personalized offers to the customer; gain a clearer view of both the customer's path to purchase and his or her journey through various channels; better understand paid channels' roles and effectiveness in converting browsers into buyers; and deliver relevant, meaningful, real-time offers across multiple channels by learning from previous interactions and self-aligning with company goals.

The big data generated by streaming social media offers a wealth of information about your brand, your industry, your competition and most important of all, your customers. It's time to start listening. It's time to start understanding. And it's time to start putting all this information to work for you.

-Darryl


Defining the Appliance Marketplace


We recently had the opportunity to speak with Roxanne Hendricks, product manager for Teradata appliances, about the new Teradata® Data Warehouse Appliance 2700. This premier platform offers improved performance and enhanced workload management.

In her Q&A, Hendricks points out that the market has changed significantly since Teradata introduced its first appliance in 2008. Over the past five years, enterprises have come to rely on Teradata appliances for disaster recovery, hybrid platforms, development sandboxes for analytics and even to complement integrated data warehouses. No matter the deployment, organizations gain not only rapid access to information but also faster time to value.

Brett Martin
Senior Editor
Teradata Magazine


Teradata EPIC Awards 2013 – nominate your company

Okay, this probably isn't new to you: at Teradata, we love data and all the amazing things you can do with it. Analytics and data warehousing are a big deal for almost all companies worldwide. And while many are already doing quite well, some do it even better. A prime example is McCain Foods Limited. The Canadian producer of oven-ready frozen products, with numerous facilities and offices around the world, is able to provide all its employees not only with data but with intelligent, actionable information acquired through advanced data analysis, at the local as well as the corporate level. In this way, company departments and employees at each site no longer need to gather and extract data to provide to their clients or colleagues, because they have direct access to the information they need through reports and dashboards.

Since Teradata is so excited about data analysis, we are always happy to hear success stories like these from our customers and, of course, from the partners who help make them happen. Therefore we honored McCain and 14 other companies with the 2012 Teradata EPIC Awards. And since we have not changed our opinion since last year, we're now ringing the bell for the 2013 edition. The winners can be Teradata customers that have significantly advanced their business objectives by implementing best-in-class analytical solutions, or Teradata partner companies, such as independent software vendors (ISVs) and system integrators (SIs), that have made outstanding contributions to Teradata and its customers and demonstrated their commitment to business success through the Teradata platform.

So if your team, too, has created a Teradata-based solution that deserves international recognition, nominate your company by July 19, 2013. Winners will be announced at the Teradata EPIC Awards ceremony held during the Teradata Partners User Group Conference & Expo this October in Dallas. We look forward to many great and inspiring entries!


5 steps to making BI more intelligent with Big Data Analytics


In recent months, I met with the Business Intelligence (BI) teams in different countries to discuss Big Data Analytics. What transpired from the meetings is clear lack of awareness of what Big Data Analytics can do for the BI team and how Big Data Analytics fit within the enterprise data warehousing (EDW). As ambassadors to their business community, BI teams have the opportunity to be at the forefront of new technology trends and be able to articulate the value of Big Data Analytics to business stakeholders.

The Big Data trend has been here for a while, and there is no shortage of publicly available resources on the subject. However, many of these sources do not seem to allow the audience “to see the wood for the trees”! Storage vendors such as Dell and EMC are not helping the situation either, confusing BI teams by emphasising low-cost storage over the business value of Big Data Analytics. I believe that paying attention to the business value of Big Data Analytics will not only make the BI team look smarter in front of the business stakeholders, but will also make it easier to get funding for Big Data Analytics projects, which many BI teams see as an opportunity to advance their careers.

In the next few paragraphs I describe some essentials of Big Data Analytics in technical terms, step by step, and how they fit into the enterprise data warehousing ecosystem as a unified data architecture (UDA) that supports the next era of analytics and business insights. Many of these examples relate to the airline industry, but the principles apply equally to any industry.

Step 1: Getting to know the essentials of Big Data

The first step in Big Data Analytics is to understand new technology capabilities such as MapReduce, Hadoop and SQL-MapReduce (SQL-MR), and how they fit within the enterprise ecosystem. It is also important to understand the differences in approach between traditional EDW and Big Data Analytics design, development and implementation processes.

For instance, if you are in the airline industry, you would have designed the enterprise data warehouse for transactional reporting and analysis with structured stable schema and normalised data model.

You probably stored unstructured data, such as ticket images, recorded audio conversations with customer service agents and ticketing/fare rules, in the database as BLOBs (Binary Large Objects). Furthermore, you may have found it difficult to express in declarative SQL complex business rules such as the financial settlement of interline agreements from code-share arrangements, open-jaw fare rules (say, between Zone 1 and Zone 3), and business rules for fuel optimisation; so you may have resorted to procedural approaches such as user-defined functions (UDFs).

But UDFs have numerous limitations that MapReduce, and more specifically SQL-MapReduce (SQL-MR), makes easy to overcome while allowing for high-performance parallel processing.

- What if you are able to use MapReduce API (Application Programming Interface) through which you can implement a UDF in the language of your choice?
- What if this approach allows maximum flexibility through polymorphism by dynamically allowing determination of input and output schema at query plan-time based on available information?  
- What if it increases reusability by enabling inputs with many different schemas or with different user-specified parameters?
- Further, what if, SQL-MR functions can be leveraged by any BI tools that you are familiar with?

As you can guess, SQL-MapReduce (SQL-MR) overcomes the limitations of UDF by leveraging the power of SQL to enable Big Data Analytics by performing relational operations efficiently while leaving non-relational tasks to procedural MapReduce functions.

You will see some examples of this later but, first and foremost, what is MapReduce? MapReduce is a parallel programming framework invented by Google and popularised by Yahoo!. MapReduce enables parallelism for non-relational data. By making parallel programming easier, MapReduce creates a new category of tools that allows BI teams to tackle Big Data problems that were previously challenging to implement. It should be noted that, unlike Teradata's relational database technology, for which parallelism has been a core competency over the last 30 years, MapReduce is not a database technology. Instead, MapReduce relies on a file system called the Hadoop Distributed File System (HDFS). Both MapReduce and HDFS are open source versions of these Big Data technologies.

Step 2: “Hello World” welcomes you to the world of MapReduce with “Word Count”

Let’s take a look at how Hadoop MapReduce works! When you wrote your first program, you may have tested it by making sure “Hello World” printed correctly. With MapReduce, you will most likely test word counts in your MapReduce program.

A MapReduce (MR) program essentially performs a group-by-aggregation in parallel over a cluster of machines. A programmer provides a map function that dictates how the grouping is performed, and a reduce function that performs the aggregation.

Let’s say you want to create a book index from Big Data Analytics for Dummies. When writing your MR program, you will provide a map function that dictates how the grouping is performed on paragraphs containing words, and a reduce function that performs the aggregation of the words to produce the book index. The MapReduce framework assumes responsibility for distributing the Map program to the cluster nodes where parts of the book are located, processed, and output to intermediate files. The output of the map processing phase is a collection of key-value pairs written to intermediate flat files. The output of the reduce phase is a collection of smaller files containing summarized data. The key-value pairs of words above are reduced to aggregates that produce the book index.
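The map and reduce phases described above can be sketched in a few lines of Python. This is an in-process illustration of the idea, not Hadoop itself; in a real cluster the framework distributes these functions across nodes and shuffles the intermediate key-value pairs between them:

```python
from collections import defaultdict

def map_phase(paragraphs):
    # Map: emit a (word, 1) key-value pair for every word in every paragraph.
    for paragraph in paragraphs:
        for word in paragraph.lower().split():
            yield (word, 1)

def reduce_phase(pairs):
    # Reduce: aggregate the counts for each word key.
    counts = defaultdict(int)
    for word, count in pairs:
        counts[word] += count
    return dict(counts)

paragraphs = ["big data analytics", "big data for dummies"]
index = reduce_phase(map_phase(paragraphs))
# index == {'big': 2, 'data': 2, 'analytics': 1, 'for': 1, 'dummies': 1}
```

The same two functions, handed to a MapReduce framework, would run in parallel over the chunks of the book stored in HDFS.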

Because the MR program runs in parallel, you will notice a tremendous increase in reading speed (e.g. grouping of paragraphs from Big Data Analytics for Dummies) and processing speed (e.g. summarising and aggregating key-value pairs) that would impress even Johnny 5.

Creating an index of words and counts from Big Data Analytics for Dummies may not be terribly interesting or useful to you, but the ability to generate such key-value pairs from any multi-structured data source can be put to analytical use: you can create a set of familiar dimensions and measures that can be integrated with data in the EDW. Perhaps, instead of a book index, you might create an index of all flight numbers, origins and destinations from an airline timetable booklet, which you may find more useful in the airline business.

Step 3: Putting MapReduce to work on business problems

Long gone are the days of GSAs (General Sales Agents) enjoying hefty sales commissions from the airlines! The market is highly competitive, and organisations are looking for the best decisions possible from analytics. With the ubiquitous availability and convenience offered by broadband connections, customers' attitudes and behaviours are rapidly changing. Now customers are looking for the best travel and holiday packages online. They are also listening to the opinions of their friends and to public remarks on social network forums. Interestingly, this is also instrumental in the rapid rate at which huge volumes of data are generated, opening up the need for Big Data technologies.

What if we could utilise the multi-structured data from click streams, Facebook, Twitter data for improving business performance? What if we are able to extract the IP Address from the click stream data and correlate with the profile of the customer from EDW along with best fare for the Round The World Travel deal that the customer is looking for? What if we are able to extract the sentiment of the customer’s travel experience from Twitter and Facebook data and use the positive / negative experience to provide the Next Best Offer during the customer’s next inbound call to the agent or online visit?
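As a toy illustration of the sentiment what-if above, here is a naive lexicon-based scorer in Python. Production text analytics is far more sophisticated, and the word lists here are invented purely for the example:

```python
# Hypothetical sentiment lexicons for airline-related tweets (illustrative only).
POSITIVE = {"great", "comfortable", "smooth", "friendly"}
NEGATIVE = {"delayed", "lost", "rude", "cramped"}

def sentiment(tweet):
    # Score a tweet by counting positive vs negative lexicon hits.
    words = set(tweet.lower().split())
    score = len(words & POSITIVE) - len(words & NEGATIVE)
    return "positive" if score > 0 else "negative" if score < 0 else "neutral"

print(sentiment("Great crew and smooth flight"))  # positive
print(sentiment("Bags lost and staff rude"))      # negative
```

A label like this, attached to each customer's recent social posts and joined to the EDW profile, is what would drive the positive/negative branch of the Next Best Offer decision.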

Step 4: Integrating unstructured and structured data for Big Data Analytics

Here we consider how the integration of multi-structured data in MapReduce and structured data in the EDW can be used to improve business outcomes. Instead of the MapReduce program for word count that you wrote previously, you will write a new MapReduce program to extract the key-value pairs for IP address, flight deals and any other relevant information from the Apache weblog files where the customer's online interactions are recorded. In a later paragraph I will describe how the MapReduce program you wrote is invoked in SQL by means of SQL-MR, or better still, how you can leverage several pre-built functions (without having to write your own MapReduce program) using SQL-MR. For now, let's assume the extracted data from MapReduce is created as a table in the EDW. The extracted IP address can then be joined with the master reference in the EDW to identify the user ID, which is then used to match the frequency of online visits, the lifetime value of the customer and so on.
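A minimal Python sketch of that extraction step, assuming the common Apache log format in which the client IP address is the first field. The in-memory `master_reference` dictionary is a hypothetical stand-in for the EDW join:

```python
import re

# First field of a common-format Apache weblog line is the client IP address.
LOG_PATTERN = re.compile(r'^(\d{1,3}(?:\.\d{1,3}){3}) ')

def extract_ips(log_lines):
    # Map-style step: emit the IP address key from each weblog record.
    for line in log_lines:
        match = LOG_PATTERN.match(line)
        if match:
            yield match.group(1)

# Hypothetical master reference (IP -> customer user ID), standing in for the EDW table.
master_reference = {"203.0.113.9": "user_42"}

logs = ['203.0.113.9 - - [17/Jun/2013:10:00:00] "GET /fares/rtw HTTP/1.1" 200 512']
visits = [master_reference.get(ip) for ip in extract_ips(logs)]
# visits == ["user_42"]
```

In the real pipeline the extraction runs as a MapReduce job over files in HDFS, and the join happens in the warehouse rather than in a Python dictionary.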

Step 5: Flying high with SQL-MR (SQL-MapReduce)!

While MapReduce is good for solving Big Data problems, it can cause a number of bottlenecks, including the requirement to write new software for answering new business questions. Trying to exploit data from HDFS through Apache Hive is another story; let's not even go there! SQL-MapReduce (SQL-MR), on the other hand, helps to reduce the bottlenecks of MapReduce by allowing maximum flexibility through polymorphism (by dynamically determining input and output schemas at query plan-time based on available information). It allows reusability by enabling inputs with many different schemas or with different user-specified parameters. More importantly, you can exploit all types of Big Data using the BI tools that you and your business analysts are familiar with.

Here you will see examples of how you may use the SQL-MR function text_parser (with just a few lines of code) to solve the word count problem / creation of a Book Index for Big Data Analytics for Dummies / extraction of IP Addresses from online clickstream data. You will notice reusability of the SQL-MR function that enables inputs with many different schemas and with different user-specified parameters to create output schema at query time.

You will find that SQL-MapReduce (SQL-MR) provides an excellent framework for jump-starting Big Data Analytics projects with substantial benefits, viz. 3 times faster development efficiencies, 5 times faster discovery and 35 times faster analytics. My colleague, Ross Farrelly, demonstrates with an example how to reduce the pain of MapReduce, which will be of interest to you as well. You can see how SQL-MR provides an excellent framework for customising and developing SQL-MR functions easily with an Integrated Development Environment (IDE).

Exploring and discovering value from Big Data is how you will divide and conquer the volume, velocity, variety and complexity characteristics of Big Data. You will also gain great benefits from seamless integration of the different Big Data technologies as a Unified Data Architecture (UDA) to provide advanced analytics.

Here is another business use case that the SQL-MR functions nPath and GraphGen solve elegantly and efficiently compared to either SQL or MapReduce. Try writing this in SQL or MapReduce and notice the difference! The business problem that we are trying to solve is related to identifying the more frequent customer activities or sequence of events that lead to disloyalty.
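To make the idea concrete, here is a small Python sketch of what a path analysis like nPath computes. This illustrates the logic only, not the SQL-MR function itself: it counts, per session, the ordered event paths that end without a completed purchase:

```python
from collections import Counter

def abandonment_paths(sessions):
    # Each session is an ordered list of page events; count the event
    # paths that never reach 'confirmation', i.e. abandoned purchases.
    paths = Counter()
    for events in sessions:
        if "confirmation" not in events:
            paths[" -> ".join(events)] += 1
    return paths

sessions = [
    ["search", "select_flight", "payment"],                   # abandoned at payment
    ["search", "select_flight", "payment"],                   # abandoned at payment
    ["search", "select_flight", "payment", "confirmation"],   # completed purchase
]
print(abandonment_paths(sessions).most_common(1))
# [('search -> select_flight -> payment', 2)]
```

The most common abandoned path points at the step to fix; nPath does the same kind of ordered pattern matching in parallel inside the database, with GraphGen turning the result into the path visualisation.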

You can see from the chart below that, of all the different channels that customers use to buy airline tickets, the online channel leads to unsuccessful ticket sales. By visualising the sequence of all customer events, you will notice that the online payment page is where abandonment occurs (noticeable from the thick purple curved line that indicates the strength of the path segment), which provides insights about the issues with the online channel. By taking corrective action ahead of the online payment event step, you will create customer loyalty and growth in sales.

Here is the SQL-MR code for the above visualisation of ticket purchase path analysis:

If you are all set and ready to go on your first-class journey with Big Data Analytics then check in here. While ‘inflight’, treat yourself to a ‘cocktail’ of analytical functions from a wide-ranging selection of 70+ pre-built SQL-MR functions.

Travel smart, impress your accompanying business stakeholder, double your rewards from analytical outcomes and enjoy your journey with Big Data Analytics! By the way, don’t forget to drop me a note, if you found this useful! Bon voyage!

Sundara Raman is a Senior Communications Industry Consultant at Teradata ANZ. He has 30 years of experience in the telecommunications industry that spans fixed line, mobile, broadband and Pay TV sectors. At Teradata, Sundara specialises in Business Value Consulting and business intelligence solutions for communication service providers.


Five Ways Big Data Will Impact Summer Travel


Memorial Day is just around the corner, and with it comes the official start of summer. For most of us, whether you're seven or seventy years old, the word "summer" is synonymous with "vacation," and vacations, of course, involve travel.

In fact, according to recent survey results from TripAdvisor, roughly a third (30 percent) of U.S. respondents are planning to travel this Memorial Day weekend. Nearly nine in ten (86 percent) are planning a leisure trip this summer. That's an increase of 6 percent and 7 percent, respectively, over TripAdvisor's survey results from last year.

What comes to mind when you think about summer travel? Is it beaches? Sightseeing? National parks, perhaps?

Not me. It may not come as a surprise to those who know me well, but when I think about a trip these days, I typically end up ruminating on one thing: big data... or, more specifically, big data analytics and how today's platforms and applications are making travel easier all over the world.

Just like every other industry, the travel and tourism industry now faces the challenge of capturing, storing and finding value in enormous quantities of data. Business insights, when they are quickly accessible and built on the right data infrastructure, can turn gut-feel decisions into smart, fact-based ones.

At Teradata, we know that when travel and tourism businesses choose to integrate operational and strategic business processes with social intelligence, they begin to drive consumer satisfaction and profitability. Why? Because big data insights directly impact:

Travel costs. Airlines, travel agencies, hotels and related services operate in a hyper-competitive environment where rapid responses to consumer demand, competition and environmental conditions can make the difference between profit and loss in operations.

Ticket availability. As with ticket prices, careful monitoring of consumer demand, competition and environmental conditions allows travel and tourism businesses to adapt to a liquid market.

The customer experience. Big data insights help travel and tourism businesses improve the customer experience by delivering critical insights to the front lines just when needed, even in real time.

Loyalty. Travel and tourism businesses that draw on big data insights build loyalty with offers designed from the details "hidden" in customer data.

A competitive advantage. Travel and tourism businesses use big data analytics to find cost-reduction drivers in irregular operations, labor productivity, fuel and materials usage, and so on.

Want an example? Just look at how our Teradata eCircle colleagues used data and applications to keep London moving last summer during the 2012 Olympics and Paralympics. On behalf of its client Transport for London (TfL), Teradata eCircle implemented sophisticated data selection to create targeted emails that could be used to communicate directly with daily commuters, sending them to a special site where they could plan their journeys to avoid travel hotspots. As a result of these emails, 35 percent of Londoners made changes to their routes, reducing traffic in central London by about 15 percent during the Games.

I expect big data analytics to have an increasingly significant impact on travel in the near future. According to the World Travel & Tourism Council (WTTC), the travel and tourism industry is expected to grow by an average of 4 percent annually over the next decade. Remarkably, that will amount to 10 percent of global GDP, or about US$10 trillion. The WTTC also anticipates that by 2022, travel and tourism will account for 328 million jobs, or one in every 10 jobs on the planet. With all these people "on the go," and all the digital information they are bound to generate, it makes perfect sense for travel and tourism businesses to start leveraging big data analytics to improve the customer experience... and not just for Memorial Day or the summer holidays, but for every day of the year.

-Darryl


See the big picture

Apache™ Hadoop® has gained popularity for its ability to process large data sets using a simple programming model. As part of the Teradata® Unified Data Architecture™, Hadoop lays the analytical data foundation to quickly and easily store and access large volumes of data.

In a nutshell, Arlene Zaima, program manager for Teradata integrated analytics solutions, says that Hadoop data contains "hidden gems" that can be integrated with the information in the data warehouse for deeper insight. Teradata has made finding those gems easier in two ways: Smart Loader allows bi-directional data transfer between Hadoop and Teradata, and a new interface named Teradata SQL-H™ helps analysts use standard interfaces to access data in Hadoop.

Enabled by this self-service environment, analysts have the luxury of enriching information from an integrated data warehouse with highly valuable data from Hadoop to find new business value from big data analysis.

Brett Martin
Senior Editor
Teradata Magazine


Flying airlines higher with analytics


Airlines, plagued with sluggish profits and frustrated customers, realize the growing importance of data analytics as customers vent their frustration across the social media universe. Dr. Nawal Taneja, an aviation industry expert and author, says carriers are facing both a challenge and an opportunity to create new customer experiences and increase profitability. In response, airlines are working ever harder not only to better understand what makes passengers happy and loyal, but also which ones are the most profitable.

The solution is better use of data and technology, such as data warehousing and analytics, that allows companies to transition from being service providers to providing solutions; to shift from a fee-based to a value-based business model. Now when they look toward the end of the runway, industry leaders are getting a glimpse of a future in which they'll be able to choose passengers, rather than the other way around.

Brett Martin
Senior Editor
Teradata Magazine


From the field of engagement: Teradata helps CPG marketers connect with today's consumer

Consumer centricity is the new "black" among consumer goods retailers.  Knowing who is buying your products, when and where (the purchase path) is fast becoming table stakes for marketers competing for the attention and wallet share of time-poor consumers.  It is no easy task, considering the diversity of touchpoints and data dependencies illustrated by Don Scheibenreif at Gartner's recent Customer 360 Summit:

 

 Consumer Insight

Often the starting point is to break down the silos of consumer data generated by the numerous agencies employed in digital, brand and shopper marketing.  This data, and the analytical insight derived from it, can inform intra- and cross-brand marketing aligned to key decision points along the path to purchase, through channels such as web, email, social media and mobile.  It is a powerful concept that we discussed recently at the Teradata APEX Conference, and it relies on data management technology and leadership commitment.

This requires not only a change in mentality around consumer data (which is good); it takes a commitment to ongoing consumer engagement that differs from seasonal, promotion-driven marketing.  This type of dialog with consumers flips the switch from product-driven messages to ones that instead serve the consumer in a friendly and considerate way.  Doing this well depends on a living record of every consumer and shopper who does business with your brand.

Before embracing this approach, some struggle with a perceived disconnect between shrinking or static marketing budgets and the need to increase the level of interaction with consumers and buyers.  "Out of sight, out of mind" never applied more to brands than it does today, given the saturation of messages aimed at consumers online and in-store.  Thus it requires a real (executive) commitment to engagement (albeit one respectful of permissions and communication preferences).

Marketing Resource Management

For this reason, other consumer goods organizations approach this opportunity operationally, focused on the efficiency of the marketing organization, its people and processes.  Marketing Resource Management, or MRM, is most easily explained as "an ERP system for marketing".  Gartner defines it as:

"... a set of processes and features designed to improve a company's ability to orchestrate and optimize internal and external marketing resources."

As with its functionality for data management and engagement, Teradata's solution for MRM is also a market leader (see the MRM Magic Quadrant shown here).

The three key functional areas of MRM (spend management, workflow and asset management) bring automation, oversight and simplification to the burdensome manual processes involved in working with and paying agencies, developing and launching campaigns, and leveraging the content and activities developed over time by the entire marketing organization.  The savings can be significant, as in tens of millions.

Increased efficiency leads to newfound agility.  Understanding what works and what doesn't, faster than before, and then deploying more resources to winning marketing programs helps brands adapt to the real-time digital world consumers operate in today.

More efficient resource allocation supports investment in a continuous engagement strategy aimed at driving trial purchases, repeat business and referrals: higher-performing marketing.  Even if these efforts cannot yet be mapped to unified consumer and shopper profile data, MRM still facilitates improved performance within a brand or a marketing channel as an integrated data strategy comes online.

Ultimately, campaigns are no longer defined or constrained by long internal processes.  Instead they become engagement engines, creating permanent consumer connections that inform product development, marketing, sales, distribution and supply.  That is the data-driven marketing future that all consumer products companies should aspire to.

Gib Bassett


Why organizations should have a data discovery capability


I have been struggling to reconcile two different thoughts over the last few months, and watching a video recently forced me to think about this again. There seems to be a catch-22 between finding value in new data and having the tools and mechanisms, justified by that value, to find the value. Firstly, I see a lot of organisations struggling to get their analytics initiatives underway and sustainable; there are many articles on the web about this. The second thought is: why should organisations have a data discovery capability? Is this a marketing term, or is there real value in it?

Something occurred to me recently. We were discussing the value chain of analytics, or something analogous to a value chain for analytics in large organisations and exploring how different pieces of data have different value and how this could be used in a BI Centre of Excellence to engage with users. The question was how do you decide when to put new data into the warehouse and what data remains outside the warehouse?

What occurred to me was that a lot of the value of much of the data we were considering had already been established or simply assumed (someone asked “loudly” enough for it and they got it). It was being stored and managed in a data warehouse, it was being accessed by users using various toolsets and although critical to the management of the business is fundamentally operational in nature. The value of a particular piece of data had been established previously and subsequently significant investment had gone into that piece of data to get it into the warehouse on an ongoing basis. That piece of data was now being used to manage and change the business so it was creating impact.

A short digression - because the value of data is determined by what it can be used to accomplish. Data has no intrinsic value, in fact even insight or actionable insight has no value unless it is put into action or changes something. Unless you change the business, a process or an offering in some way the data is merely interesting not important. Analytics teams are often cut off from the business and the ability to impact the business in a meaningful way.

The question became how do you get new pieces of data added to the operational store, the warehouse, because that is how it gets used and therefore that’s how it becomes valuable when you don’t know it is valuable yet? You have to know it is worth something before you integrate it into the data warehouse because there is a significant investment in integrating a piece of data into the warehouse. You have to know it is worth something before you invest in operationalising it. Seems a little confusing because you have to know it is valuable before you make it available to make it valuable, in which case you don’t actually know it is valuable.

Knowing that a piece of data is worth something is also important in justifying analytics teams and getting analytics initiatives up and going. Until you know for a fact that something has value, there is no way to build a business case that funds the building of an analytics team. I have seen a number of organisations try to address this problem by employing a small team, sometimes an individual, to be the analytics team. Problem solved, as it makes it an opex problem, something the business can fund month to month. But this approach struggles to gain momentum, struggles to justify value and is very difficult to build a long-term business case around.

These teams are either accommodated in IT where they can often get the data but not the business question or they are housed in the business and struggle to get the data. In addition to this much of the value in analytics comes from looking across the business. Silos tend to be quite good at optimising their narrow world, the value comes from optimising across silos.

This is when a discovery platform may be valuable. If you can provide easily accessible analytical 'sandboxes' that are both easy to use and can access all types of data, you change the problem from one of funding to one of testing the findings. Currently, discovering the value in a piece of data is hard. There is no single technology that addresses all requirements, so users must employ multiple tools. HDFS and Hadoop are attracting a lot of interest but are not the easiest to use, especially for business users. SQL is positioned as more of a business language but does not access all data structures. So what do you do if you want to find valuable data but are skills-constrained?

Someone mentioned to me one of the ways this can be done is using an agile analytics methodology or approach. In my experience of agile, admittedly mostly in software development, “agile” has often become an excuse for no documentation, no objective or no accountability so I have been a little sceptical about anything labelled ‘agile’. Admittedly it has come a long way since I first bumped into agile so decided to test my bias and looked through some articles on an internal website about an analytical agility capability. Don’t get me wrong, I buy the drivers for agility – very short term deliverables, direct business involvement, the output is more important than the governance or methodology so I would like it to work.  This was also about analytic agility not so much an agile methodology.

The only way to identify new items of valuable data is to experiment and test. Something I have been hearing is "fail fast", which sounds bad but really means test lots of things, do it properly and quickly determine which ones are not going to work. Take successful experiments and operationalise them fast. This is what a discovery platform can enable. I would like to get other people's views, but to me this is a way to rapidly test things and determine which data should be integrated into the data warehouse long term.

We are seeing the emergence of the discovery platform: a set of technologies that makes it easier to integrate multiple sources and types of data while providing a uniform mechanism to access them, namely SQL. Discovery platforms also provide a means to test an insight rapidly and thereby prove its value before investing in operationalising it. You get to test and evaluate the value in a meaningful way before having to invest in making it available.
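The sandbox idea above can be sketched in miniature. The following is a toy illustration only, using invented data and an in-memory SQLite database as a stand-in for a real discovery platform (which would operate at far larger scale over Hadoop and other stores): heterogeneous sources (JSON weblogs, CSV sales) are pulled into a sandbox and explored with one uniform SQL dialect before any warehouse investment is made.

```python
import csv
import io
import json
import sqlite3

# Two invented sources in different formats - a JSON clickstream
# extract and a CSV sales extract. Table names and fields are
# illustrative, not from any real system.
weblog_json = '[{"user": "u1", "clicks": 12}, {"user": "u2", "clicks": 3}]'
sales_csv = "user,spend\nu1,100\nu2,40\n"

# The "sandbox": a throwaway in-memory database.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE weblog (user TEXT, clicks INTEGER)")
db.execute("CREATE TABLE sales (user TEXT, spend REAL)")
db.executemany("INSERT INTO weblog VALUES (?, ?)",
               [(r["user"], r["clicks"]) for r in json.loads(weblog_json)])
db.executemany("INSERT INTO sales VALUES (?, ?)",
               [(r["user"], float(r["spend"]))
                for r in csv.DictReader(io.StringIO(sales_csv))])

# One SQL dialect across both sources: test whether click activity
# relates to spend before deciding to integrate the clickstream
# into the warehouse.
rows = db.execute("""
    SELECT w.user, w.clicks, s.spend
    FROM weblog w JOIN sales s ON w.user = s.user
    ORDER BY s.spend DESC
""").fetchall()
```

If the join shows a relationship worth acting on, that is the evidence for operationalising the new source; if not, the sandbox is simply discarded at no further cost.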

If anyone has a view on either an agile approach to identifying valuable data or how a discovery platform can help in operationalising analytics, it would be great to hear from you.

Craig Rodger is a senior Pre-sales Consultant with Teradata ANZ focusing on advanced analytics. He has spent 20 years in the IT industry working on how to get value out of systems rather than getting things into them. Having been a member of a number of executive management teams in software, technology and consulting companies and helping build a number of technology business ventures he joined an advanced analytics vendor.


The reality of big data analytics


"We tend to overestimate the effect of a technology in the short run and underestimate the effect in the long run," said Roy Amara, former president of the California-based Institute for the Future. How right he was. Think about the old days of the World Wide Web. During the 1990s it was widely believed that the Internet would revolutionize our economy suddenly. And related technological advance was expected to boost profits in the near future. This "irrational exuberance" (Alan Greenspan) then led to the dot-com bubble at the turn of the century. Meanwhile, the Web has changed business dramatically and those Internet-based corporations that managed to survive the turmoil began to see profits or even rose to become industry-dominating enterprises.

The entire story is what Gartner calls a hype cycle. The consulting company has been characterizing what typically happens to new technologies since 1995. And as I recently mentioned, the phenomenon of big data – according to Gartner's report for 2012 – is about to reach the top of the hype graph. If that turned out to be true, big data would soon slip towards the trough of disillusionment. Bang! You wouldn't hear much about big data technologies for a while. They would only slowly recover and would finally level off at a normal stage, embedded among many other established technologies.

Newsflash! A recent report published by the Business Application Research Center (BARC) in Würzburg, Germany, shows that big data is already transforming companies. The Big Data survey was conducted in major European software markets and answered by 274 decision makers in IT and other departments. The answers show that – beyond all the talk about it – big data has become a reality within European companies, helping managers gain deep insights into markets and customer needs.

While 14 percent of the companies have already developed a detailed big data strategy, 75 percent are aware of the new possibilities arising from big data technologies. However, most companies still face serious challenges in monetizing big data. Lack of expertise is one of the main obstacles, but you can clearly see the trend: big data has long since moved onto the slope of enlightenment. We are there to help a broad range of users benefit from big data technologies. And I feel we are getting very close to the final stage.


Next episode: Dr. Dre meets big data


Fight On!

As a USC alum, it's strange that I cringe when other well-meaning Trojans shout the school's battle cry.  But last week, instead of hearing the rhetoric of a marching band in my head at the thought of the 'SC fight song, I was feelin' a little more hip-hop. I had a kindred spirit in Dr. Dre.

'Cuz (as Dre would say) famous hip-hop star Dr. Dre and music mogul Jimmy Iovine announced a $70 million donation to the university to create a new academy for music, focusing on the intersection of art, technology, business and innovation. The curriculum includes computer science. Entrepreneurship. Art. Marketing. Analytics.

Couldn't we all stand to find these intersections a little more clearly?

This is the challenge for the media and entertainment industry today – the need to find that intersection of "gut" and "insight."  I'm sure this is the bane of any long-standing creative industry in today's data-driven climate. That's why it's hard to dispute the collective wisdom of creative powerhouses that have been at their trades for decades. But no one is arguing for a wholesale switch. Rather, just some more appreciation for the intersection.

In the handful of years I've worked in analytics – preceded by many more handfuls of years working in production, post-production and digital media – I've seen a real ramp-up in the role of analytics at traditional and digital media companies alike.  But the truth is there are still factions.  Whether I'm talking to a content creator, distributor, publisher or MSO, there are often camps: the analytics-are-overrated camp versus the analytics-are-our-future camp.  Those two camps are starting to meet in the middle – and it's about time.

At the risk of sounding overly prophetic, there is beauty in the intersection of art and science.  And that, I believe, is the promise of big data analytics across the content value chain. When creative companies can integrate what they know about their audiences, their content, their channels and their marketing, they can unleash the value of the intersection of art and science.  Any successful analytics framework requires a detailed understanding of the art of both.

So, for all you aspiring data artisans out there, take heart! Dr. Dre has got your back on this. Fight On!


Seeking: data scientists


A friend of mine has a teenage son who loves computers and video games and is a math whiz. What career path is my friend recommending to him? Data science! The teenager is already taking extra statistics classes to get ready for college. Let's hope there are more kids like him out there, because we're going to need them.

In the current environment, finding a data scientist can be a struggle. The shortage is rooted in the fact that only a relatively small group of people excel at both the advanced analytics concepts and the business acumen it takes to help enterprises turn enormous volumes of data into actionable insights. In our infographic "A Job Well Done," we explore what stakeholders say about finding these critical experts.

And no, sorry, my friend's son is not accepting offers – yet.

Jennifer Niemelä
Executive Editor
Teradata Magazine


Tips for creating better communication via email

So you've received an email from your boss with the comment "FYI" at the top of the email trail, under an even more obscure subject line like "catch-up".  You dutifully go through the email trail trying to understand the context and why your boss wanted you to read it.  You go through the attachments, try to work out what is relevant, and then try to deduce exactly what action is required of you.  5-10 minutes later you decide you have read enough, and that if it were that important, someone would eventually ask you to do something.

But you feel uncertain.  So you keep a copy of the email, or tag it, or put it in a "read later" folder just in case.  Now multiply this effort by the hundreds of emails you receive every day and you can see how fast it eats up your day.  What is worse, it is a vicious circle – you forward the email to your colleagues and subordinates with your own "FYI" at the top.   This is lazy and wasteful communication, and it has become normal.

If you have already invested time and effort to analyse the communication you received, wouldn't it be more productive to save others from going through the same pain?  Wouldn't it be good to have a simple test you can use to verify the effectiveness of your communication?  My answer: S C Q A*.  (*Derived from the communication framework by Barbara Minto, http://www.barbaraminto.com/.)

The acronym stands for Situation, Complication, Question and Answer.  Each piece of communication you prepare, whether an email, a presentation or a proposal paper, should have these contained in the first two or three paragraphs.

So here's how to apply it:

Situation: does my communication set out the situation clearly, so people understand whether it applies to them or not? This helps filter for the people concerned with the issue, saving time and effort.

(S) XYZ SW version 4.3 has been delayed.

Complication: has my communication conveyed what has changed in the situation, creating a sense of urgency so people pay more attention and are prepared to take action? (C) Staff involved in pre-sales may be offering functionality that will not be available.

Question: have I framed my question so there is no ambiguity about what the reader should be looking for? This stops the reader from second-guessing what the questions should be. It also helps validate the completeness of the answer you should provide.   (Q) What are the new features in the new release and what is the revised release date?

Answer: having framed the question, you must provide the complete answer to it. This can be as long as it needs to be.  (A) Temporal and columnar capabilities are the key new features of the new version. The release date is now Q4 2014.

The entire email now reads:

XYZ SW version 4.3 has been delayed. Staff involved in pre-sales may be offering functionality that will not be available.

What are the new features in the new version and what is the revised release date? Temporal and columnar capabilities are the key new features of the new version. The release date is now Q4 2014.  For more details, please read the attached document.

Compare this with the short but pointless version:

Kind regards. Refer to attached.

Now take a look at an email you were about to send and run an "SCQA" check on it!  Send me your before and after versions.  Happy to talk them through.

Renato Manongdo is an industry consultant with Teradata ANZ with extensive experience in customer management in the financial services, insurance and healthcare industries.


Open Access to Big Data a Major Driver of Value for eBay


At Teradata's Big Data Analytics Summit, held recently in Sydney, Alex Liang of eBay (Director of the offshore Analytics Platform and Delivery) presented on their big data ecosystem. It should be noted that his description has to be taken in the context of eBay – a company whose business is its website: a marketplace on which they aim to match customers' desires with sellers' products. This takes place on a massive scale with a requirement for 99.9+ percent availability.

Nevertheless, despite this challenging environment, eBay is committed to the democratisation of data – that is, making data available to large number of employees to query, predict and experiment. To this end they have a decentralised data management system which allows employees to create a virtual datamart, ingest data, identify a trend or gain an insight, form an hypothesis, design an experiment to test that hypothesis, implement that experiment on the eBay website (via A/B testing), measure the results and undo the changes if necessary – all with a great deal of autonomy.  However, with freedom comes responsibility, so employees using data in this way are also responsible for the results they generate from their data analyses.
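The measure-and-undo loop described above ultimately comes down to deciding whether an experiment moved the needle. As a minimal sketch only, with invented counts (eBay's actual experimentation platform is not public), a two-proportion z-test is one standard way to evaluate an A/B result:

```python
import math

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """z-statistic for the difference between two conversion rates
    (variant B minus control A), using the pooled standard error."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p = (conv_a + conv_b) / (n_a + n_b)  # pooled conversion rate
    se = math.sqrt(p * (1 - p) * (1 / n_a + 1 / n_b))
    return (p_b - p_a) / se

# Invented example: 480/10,000 conversions on control, 560/10,000 on
# the variant exposed via A/B testing.
z = two_proportion_z(conv_a=480, n_a=10_000, conv_b=560, n_b=10_000)
```

If |z| exceeds 1.96, "no difference" is rejected at the 5% level and the change can stay; otherwise the employee undoes it, exactly the autonomy-with-responsibility loop the post describes.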

Can this approach be applied to other companies? As a general principle it can – the philosophy of allowing a larger number of users access to the valuable data held by the company can, if implemented well, lead to outstanding results. Allowing users to run free on the data (within certain well-defined limits, of course), only reining them in when they approach those limits, can let companies exploit the value of big data in a dramatically improved manner. There is a profound philosophical difference between giving users wide access to data and placing restrictions only where needed, as opposed to starting with very limited access and adding to it if and only if there is a compelling business need (somewhat analogous to the difference between continental and English common law systems).

Has this philosophy of making data as widely available as possible taken root in Australia? Based on the number of questions asked during the conference about how to restrict access and monitor behaviors I would say we still have a long way to go. Of course there is a need to balance the free access to data with the need for appropriate restrictions and Alex outlined eBay's approach to implementing those restrictions including: permissions, automated monitoring, automated retirement of cold data-marts, productionisation of hot data-marts and the need to pass an exam to get access to Teradata. But it is informative that most questions were about how to manage restrictions rather than on the benefits of open access to data.

Another interesting use of data at eBay is the meta-analysis of the queries being submitted. Alex described a program they have that uses Python to analyse the queries being submitted to Teradata, Singularity and Hadoop. The aim is to identify sub-optimal queries, but also to identify commonly requested information and to develop more efficient ways to deliver it to users. This is an example of a growing trend of data generating data – the data in the warehouse, for example, indirectly causes users to write queries, which themselves then become data that can be analyzed.
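The flavour of such query meta-analysis can be sketched in a few lines. This is a hedged illustration only: the log records, thresholds and heuristics below are invented, not eBay's actual pipeline or log schema.

```python
import re
from collections import Counter

# Hypothetical query-log records: (sql_text, runtime_seconds).
QUERY_LOG = [
    ("SELECT * FROM sales", 420.0),
    ("SELECT item_id, price FROM sales WHERE dt = '2013-06-01'", 3.2),
    ("SELECT * FROM listings", 610.5),
    ("SELECT item_id, price FROM sales WHERE dt = '2013-06-02'", 2.9),
]

def flag_suboptimal(log, max_seconds=60.0):
    """Flag likely sub-optimal queries: unqualified SELECT * or
    runtimes above a threshold (both heuristics are illustrative)."""
    return [sql for sql, secs in log
            if secs > max_seconds or re.match(r"(?i)select\s+\*", sql)]

def common_tables(log):
    """Count table references to spot commonly requested data that
    might deserve a more efficient delivery mechanism."""
    tables = Counter()
    for sql, _ in log:
        tables.update(t.lower() for t in re.findall(r"(?i)\bfrom\s+(\w+)", sql))
    return tables

suspects = flag_suboptimal(QUERY_LOG)
popular = common_tables(QUERY_LOG)
```

Here the queries-about-queries pattern is explicit: the output of `common_tables` is itself new data (e.g. that one table dominates demand) that can drive optimisation decisions.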

Ross Farrelly is the Chief Data Scientist for Teradata ANZ, responsible for data mining, analytics and advanced modeling projects using the Teradata Aster platform. He is a Six Sigma black belt and has had many years of experience in a variety of statistical roles.


How to super-power your efforts: learn from Heroes of Analytics

In Hollywood, summer means blockbuster movies ... and blockbusters mean explosions, car crashes, daredevil stunts and, of course, superheroes. Summer 2013 is no exception: with titles like Man of Steel and The Lone Ranger, movie audiences are in for lots of action-packed adventures in the coming months.

Of course, not everyone needs to go to the theater to get their fill of superhero action. Some of us are fortunate to work in offices where every day we stand shoulder to shoulder with superheroes – the superheroes of data analytics, that is.

These heroes transform big data from an insurmountable business challenge into actionable insights, acquired and processed to drive revenue. As described on the Teradata and SAS Analytics Heroes website:

"One by one, they emerge from the darkness, snatching the value from the clutches of ambiguity, exposing fraud in a sea of chaos and innovate in the face of statistical impossibility ..."

As you can tell, here at Teradata we've had some fun with the superhero metaphor. But that's not to diminish, in any way, the truly trailblazing work that analytics professionals commit to every day (and a lot of nights, too).

For example, meet Megavox – otherwise known as Frank Caputo.

As a member of the marketing team at Medibank, Frank used Teradata database and SAS solutions to cut campaign creation time from hours to minutes, optimize marketing investments, minimize dependence on external suppliers and more – leading to enormous improvements in campaign effectiveness, together with significant cost reductions and a substantial increase in revenue.

Is there an analytics superhero at your company? Maybe you are one yourself? If so, please consider applying to our Analytics Heroes program. It's your chance to receive the fame and respect an analytics hero deserves, get your own action figure and even be immortalized by a famous comic artist.

Speaking of fame and respect, we are also now accepting nominations for the 2013 Teradata EPIC Awards, which recognize our customers and partners for their leadership in implementing data and analytics solutions.

If your team has created a solution based on a Teradata platform that delivered bottom-line business value to your company, this is an opportunity to win the recognition it deserves. The application deadline is July 19, 2013, and award winners will be announced and celebrated at the Teradata EPIC Awards ceremony held during the Teradata Partners Conference & User Group Expo this October in Dallas.

How will you super-power your analytics? Apply to the Teradata Analytics Heroes program and/or for an EPIC Award to share your story ... and who knows, maybe you'll be on your way to saving the world, one data-driven campaign at a time.

-Darryl


Saturday, June 15, 2013

3 ways for MSPs to avoid summer BDR laziness

Can summer heat cause as big a disaster as a hurricane or tornado?  We turned to backup and disaster recovery specialist and MSP Strata Information Technology, Inc. to find out. President Pete Robbins follows three simple steps to keep his customers in control during the summer heat. We reveal the scoop in this MSPmentor exclusive.

Strata Information Technology, Inc. President Pete Robbins said laziness can kick in when disasters strike only occasionally.
Robbins suggested to MSPmentor that even for MSPs located in an area less likely to be affected by a natural disaster, it is still important to stay focused and prepared.
His company is located in the Los Angeles, California, area, which, according to Robbins, has rarely, if ever, experienced a calamity since 1994. Since this is the case, Robbins noted, "a bit of laziness" can set in.
To overcome these drawbacks, Robbins prepares his company's customers for disasters with the following:
Meet with customers every year to discuss business continuity (BC) – understand how clients plan to keep the business running during a disaster, including disasters not caused by nature. His company reviews the plan to close any holes. If customers don't have a plan, or if they are new, Robbins works with them to prepare one;
Budget and implement changes – changes may be necessary to provide BDR solutions or BC plans. These alterations must be properly budgeted. Assist customers by identifying the costs. Help implement the changes for your customer; and
Test customer plans – plans need to be tested for areas that are still at risk. Don't let Mother Nature control you. Take the summer to test, test and test plans.
How do you keep your customers from falling prey to the heat of summer? How often do you review BDR solutions or BC plans with them?

Next-generation MSPs: what will they look like?

The IBM MSP Summit will feature Actifio, Symmetry, TW Ventures and RackForce on a managed services provider panel. Where can next-generation MSPs profit from cloud services? Here's a preview.

Former MSP Tommy Wald is now an angel investor and technology advisor.
What will next-generation MSPs (managed services providers) look like? The IBM MSP Summit, this week in Las Vegas, is set to explore that question. I'll be on hand to moderate a panel with four vastly different MSPs. And each of those MSPs will share some clues about where the industry is going next.
They include:
Take a closer look at that list and you will come to some clear conclusions:
MSP definitions are constantly changing and expanding. The classic MSP focused on SMB remote monitoring is now a commodity story.
Cloud services are no longer a "future" opportunity. They are now a reality. Sophisticated MSPs with deep application expertise can do more than "resell" enterprise cloud applications. Symmetry is a case in point.
IBM wants to collaborate with MSPs in multiple ways – offering converged hardware for MSP data centers and allowing MSPs to connect to IBM SmartCloud services.
If you're launching an MSP in 2013 or 2014, the business plan will look noticeably different from an MSP plan of 2005 or even 2010.
These are some of my pre-event opinions. The MSP Summit panel is Tuesday, June 11. I will be sure to offer a thorough recap of the panel discussion once it wraps.

BDR 101: appliance maintenance for Datto partners

With summer around the corner, MSPmentor wanted to discover how MSPs could easily monitor backup and disaster recovery (BDR) appliances without making it seem like an additional burden, so we reached out to disaster recovery (DR) and business continuity (BC) solutions vendor Datto for some answers.

Maintaining backup and disaster recovery (BDR) appliances is a task that dovetails nicely with the managed service provider's (MSP's) main mission of monitoring and managing the customer's infrastructure. But with summer around the corner, MSPmentor wanted to discover how MSPs could simplify the process. So we reached out to disaster recovery (DR) and business continuity (BC) solutions vendor Datto for some answers. How can MSPs keep Datto appliances running efficiently? We reveal the answer.

Datto Sales Engineer Dan Ciccone spoke briefly with MSPmentor via e-mail to provide some insider tips and tricks for MSPs.

"The best way to maintain your Datto appliance is to keep an eye on what is happening with every device in your fleet," he said.

He offered the following tips for MSPs providing customers with Datto equipment, revealing how the company's centralized management console in the partner portal can be an effective way to maintain the appliances:

Make sure that the last backup for each device was successful;

Verify that backups are captured and can run as virtual machines. This can be done through a screenshot verification function visible in the management console;

Access any device in your fleet remotely to alleviate any problems at the end user's location; and

Ping devices with alerting enabled. If there is a problem with the hardware, you will receive a text message or an email regarding the error.

Instead of waiting for something to happen, be proactive in monitoring and maintaining your Datto appliances. Access the centralized management console to take advantage of the company's appliance maintenance tools. Which features do you use most?