Professional football is a unique business in that its most important assets are put in harm’s way as a function of their job. The NFL has made progress in protecting players from the hazards of playing football, but the day-to-day trauma sustained on the field is often missed. Thanks to the many technological breakthroughs achieved by the wireless industry, a new world of wearable physiological status monitoring sensors now exists.
Recently, I had the opportunity to interview a half dozen CIOs and a half dozen CFOs. Like a marriage therapist, I got to hear each party’s side of the story about the relationship. CFOs, in particular, felt that the quality of the relationship could impact their businesses’ success.
Office 365 is a great cloud service that everyone is talking about. If you work in IT, you have probably thought about trying it, but you may not be familiar with how user provisioning and management are handled, or with the different deployment methods.
Some of the real power of Microsoft Azure is in hybrid architectures. Not only can you combine the power of PaaS and IaaS resources where it makes sense, you can also add these resources as an extension of your corporate environment. Microsoft recently added a new method for connecting Azure Web Sites and Azure Mobile Services to on-premises resources with a preview feature called Hybrid Connection.
Multi-Factor Authentication (MFA) is an added level of security over your familiar username and password. In the MFA model, after you provide your username and password, you are also asked to provide an additional verification code. That code is randomly generated and sent to you at the time of the login transaction. The code can also be generated by an application on your smartphone, PC, or a small device you carry with you – also known as a “fob.”
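The one-time codes generated by a smartphone app or fob typically follow the TOTP scheme (RFC 6238): an HMAC over a counter derived from the current time, truncated to six digits. Here is a minimal sketch in Python using only the standard library (the function names are mine, not from any product mentioned here):

```python
import hashlib
import hmac
import struct
import time

def hotp(key, counter, digits=6):
    # HMAC-SHA1 over the 8-byte big-endian counter (RFC 4226)
    mac = hmac.new(key, struct.pack(">Q", counter), hashlib.sha1).digest()
    # Dynamic truncation: the low nibble of the last byte picks an offset
    offset = mac[-1] & 0x0F
    code = (struct.unpack(">I", mac[offset:offset + 4])[0] & 0x7FFFFFFF) % (10 ** digits)
    return str(code).zfill(digits)

def totp(key, timestamp=None, step=30):
    # Time-based variant (RFC 6238): the counter is the number of 30-second steps
    t = time.time() if timestamp is None else timestamp
    return hotp(key, int(t // step))
```

Because the counter advances every 30 seconds, the server and the app compute the same code independently, which is why the app-generated variant works even without a code being sent to you.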
What is a major goal of any OBIEE implementation? There is no single answer. However, IT managers and business stakeholders are almost always concerned with getting the most return on the BI investment and improving the company’s bottom line. There are ways that an OBIEE implementation can cut costs, whether used in a proof-of-concept scenario or as a full-scale, corporate-wide roll-out.
Microsoft added functionality to easily convert VMware VMs and migrate them to a Microsoft Azure account using a simple wizard. Microsoft Virtual Machine Converter (MVMC) 2.0 enables this functionality and more, paving the way for VMware administrators to give Azure a try.
Have you had a system bug drive you up the wall, across the ceiling and then back down again? Have you stared at a screen over and over hoping to find the cause of your angst? If you answered “yes” to either of these questions, you are going to easily understand what I just went through when cloning some systems on AWS.
What is the biggest challenge with integrating the Cloud in your backup architecture? I would argue that restore times could be the biggest hurdle, especially when we are talking about multiple terabytes of data. Cloud storage is a great backup target for those incremental changes that can trickle up to the Cloud each day. It isn’t great for those large restore operations that are sometimes a necessity.
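To see why multi-terabyte restores over a WAN hurt, a back-of-the-envelope calculation helps. This sketch is illustrative, not from any vendor tool; the assumption that a link sustains about 80% of its rated throughput is mine:

```python
def restore_hours(data_tb, bandwidth_mbps, efficiency=0.8):
    """Rough wall-clock hours to pull data_tb terabytes over a link rated bandwidth_mbps."""
    bits = data_tb * 1e12 * 8                             # terabytes -> bits
    seconds = bits / (bandwidth_mbps * 1e6 * efficiency)  # sustained effective rate
    return seconds / 3600

# Restoring 10 TB over a 100 Mbps link takes on the order of 278 hours -- over 11 days.
```

Numbers like these are why cloud storage shines for trickling incremental backups up each day, yet struggles as the source for a full-scale restore.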
The OBIEE challenge for BI managers is deciding whether or not to upgrade OBIEE and, if so, when to do it. It is a matter of balance between keeping the existing reporting environment up-to-date and minding other factors such as budget, testing effort, risk mitigation, and expertise on hand.
Windows Azure Backup can help you back up and protect your data in your on-premises data center or cloud environment. This post outlines five possibilities for a Microsoft Azure hybrid backup architecture.
In this final post, let’s take a look at how you can fully embrace and take advantage of cloud computing. Once your app is up on AWS, you have laid the foundation for taking advantage of modern cloud technology. I tend to look at the cloud in two ways: the first is an application that runs in the cloud; the second is an app that is designed for the cloud.
Over the past few weeks, we have outlined the criteria for selecting a candidate app to move to the cloud and we reviewed the deployment process. Now, it is time to put things into action. When it comes to migrating, there is nothing even remotely close to a one-size-fits-all solution for moving app data and users to the cloud.
Reports from IDC and other analysts indicate that e-mail and related applications are the most likely platforms to move to the public cloud. But while interest in making the move is high, there is still a question of when to do it. Our clients often ask, “When should we evaluate moving to 365?” To answer this question, we suggest using an event-driven evaluation.
In my post last week, we discussed key criteria for selecting an app to move to the cloud on AWS. Once you have that app selected, it is time to put things into motion.
This week, we will explore how to get large amounts of data in or out of Windows Azure Blob storage. Microsoft recently added the Windows Azure Import/Export service, which enables customers to use hard disks to move data in and out of Microsoft datacenters. Microsoft uses its high-speed secure network to transfer the data in or out of your storage account.
Over the next few weeks, I am going to walk through the process of moving your first app to AWS. The goal here is to lay out a plan that introduces as little risk as possible to ensure a clean migration. We want to obtain buy-in and instill confidence across the organization that will lead to additional cloud projects that leverage newer cloud-specific technologies.
The spring thaw doesn’t seem to be on the way. We can only hope the saying “March comes in like a lion and goes out like a lamb” is true this year. As we dig out in the Northeast and look forward to a fresh start with the arrival of spring, it is also a good time to think about making a fresh plan to protect company data. And, coincidentally, March 31st happens to be World Backup Day — a great reminder to implement this fresh start.
Cloud-based Software as a Service (SaaS) applications and services such as Salesforce.com (SFDC) are now common in the business world. They offer quick solutions and are easily adaptable to business users. However, this ease and speed of customization presents challenges.
Windows Azure Storage has some great features that make it very flexible, giving you lots of options for designing solutions. I will share some of those features and explain how they can be used.
Before you jump in and start building things in Windows Azure, there are a few concepts that will make life easier for you if you understand them up front. These topics are in the context of infrastructure as a service (IaaS) but some of them apply to platform as a service (PaaS) as well.
Check out what Steve Bennett, CEO of Symantec – a valued Corporate Technologies partner – has to say about the security challenges of a cloud world and what Symantec is doing to protect the information of our mutual customers.
There is only one thing worse than dealing with an unexpected OBIEE emergency: dealing with it unprepared. Yet having a good risk management plan in place is something many BI managers miss or delay, thinking an emergency is not likely to happen. I hope this blog post helps you to realize the importance of drafting your OBIEE risk mitigation policy and collecting a list of items you will need in case of an emergency.
Having recently moved a small shop to Office 365 (62 Days to the Cloud), I would like to share a few observations about what we learned so that you can prepare for the best experience. This is not a complete set of tips, but there are a few here that I suspect will save you some time and frustration.
Whether you have a new or existing project, at some point you may have to deal with patching your OBIEE installation. Even if that is something handled by your infrastructure team, it is good for you to understand the impact as well as the implications.
In this blog post, I will describe two common issues faced when converting legacy reports (such as Excel, Access, or other homegrown solutions) into OBIEE. Typically, these projects involve two types of stakeholders: customers who are used to a certain user interface or report formatting, and builders (normally within an IT department) who might know OBIEE but are not familiar with the legacy system.
Our young enterprise is operating now with Amazon Web Services (AWS) instances in a Virtual Private Cloud connected to the corporate Local Area Network. We are doing this via the Internet with a Virtual Private Network connection. All collaboration and messaging is served by Office 365. Production is ramping and the business continues to grow. So, what’s next?
It is easy to get stuck in the MUD (multi-user development) with OBIEE. I wish I had a better name for it (such as MDE for multi-development environment or MODE for multi-user OBIEE development environment), but I don’t. It’s MUD. The good news is that it doesn’t have to be messy. With a strategy in place, you can navigate the murky waters with ease.
In enterprise environments, database migrations happen. More than once in such situations, I have wanted to move an existing OBIEE installation’s repository tables (created by RCU) to a different database system. In looking through different files to see how to make that change, it appears there is no supported way of doing this: the files containing the setting are spread throughout the system, and some of them may exist inside jar files.
In the past twenty years or so, there have been only a handful of technological innovations that were so revolutionary that news of them rippled through the industry. Not since the advent of the World Wide Web has there been something as innovative as the current cloud computing wave, led by Amazon Web Services (AWS).
OBIEE administration is frequently an avoided topic. However, I believe that it is important for OBIEE projects to have processes and procedures in place to ensure smooth operation and administration of OBIEE services. I have witnessed how issues related to OBIEE administration can wreak havoc on development deliverables and timelines, contributing to delays and unnecessary costs.
When discussing cloud solutions, one of the first things I’m always asked about is information security. I understand the need to control data access and the general fear of your data getting into the wrong hands. However, what many people fail to realize is that in many cases your data is more secure when hosted with a cloud provider. Many would call this a counterintuitive suggestion, but in fact it’s an interesting paradox.
Several tasks within the IT landscape often take much longer than anyone would like. One, in particular, is data migration. Data migrations are common across every IT organization and are necessary for many reasons.
Beginning on July 8, our customer set out to transform their startup infrastructure of a single domain/file/print server into an enterprise-class Office 365 environment. The goal was to do this with a flexible compute center at Amazon Web Services. About 80 people were involved, and they had been using an Application Service Provider (ASP) for Microsoft Exchange services.
With the explosion of structured and unstructured data and affordable technologies to handle it all, the importance of choosing the right Business Intelligence (BI) for Big Data has never been greater. With the ability to sort, store, and analyze data, an enterprise is able to maximize its data investment with Business Intelligence. In this blog, I will help you navigate your journey through BI options to find the right match for your needs.
We are concluding this blog series by looking at predictive analytics, the tools that use structured and unstructured data as the basis for making impactful decisions. Predictive analytics enable key decision-makers, whether automated or human, to interpret data and use it to forecast outcomes.
Recently I talked about how changing seasons can affect computing needs. It’s inefficient to size your environment for only a few months of heavy activity. A more efficient solution is to use a cloud platform to scale your environment as needed. Now that we’ve established the concept, let’s examine how this actually works. Each cloud provider has their own implementation method and for this example we’ll use Amazon’s.
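The core idea behind any provider’s auto scaling, Amazon’s included, is a simple feedback loop: compare a load metric against thresholds and adjust the instance count within fixed bounds. Here is a toy sketch of that decision logic; the function name and the thresholds are hypothetical illustrations, not an AWS API:

```python
def desired_instances(current, avg_cpu,
                      scale_out_at=70.0, scale_in_at=30.0,
                      min_size=2, max_size=10):
    # Scale out when average CPU crosses the high-water mark,
    # scale in when it drops below the low-water mark,
    # always staying within the configured fleet bounds.
    if avg_cpu > scale_out_at:
        return min(current + 1, max_size)
    if avg_cpu < scale_in_at:
        return max(current - 1, min_size)
    return current
```

In a real deployment, the cloud provider evaluates rules like these against monitoring metrics on your behalf, so the fleet grows during the busy season and shrinks afterward without manual intervention.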
It’s hard to believe that OBIEE 11G has been out for over two years. Since its first release in July 2010, there have been many great improvements in the OBIEE 11G product. In this blog, I will share a few of the useful new features in the latest release.
This is the third blog of a three-part series (Part 1, Part 2) exploring the Oracle Business Intelligence (OBI) certification process. In this blog post, I will review different preparation strategies for the Oracle Business Intelligence Foundation Suite 11 Essentials Exam (1Z0-591) and provide general test-taking tips.
Data is everywhere — quotes, bookings, leads, campaigns. Salesforce.com (SFDC) helps you manage and distribute this data, yet how do you get the most value out of this data? Do you know who your top performing sales reps are, or how your forecast accuracy has trended over the last three months? How would you map sales through to campaigns, or understand how long it takes a new rep to effectively build a pipeline?
Let’s move on to NetApp’s implementation of SSDs in its cache lineup. This feature is called Flash Pool, and it gives customers the ability to add SSD-based cache to an existing hard drive-based aggregate. This is an aggregate level read/write cache. That is important because unlike Flash Cache, it is persistent across takeovers and givebacks, and caching policies can be set at the volume level.
With fall upon us and the leaves turning colors, it’s a good time to think about how seasons affect computing requirements. Many retailers earn the majority of their revenue during a single period of the year. This is never more apparent than during the holiday season, when stores all across the country run sales and promotions while increasing inventories to meet holiday demand and excitement.
You wake up and find your clothes have been selected for you. Your wardrobe knows what the weather is like and who your meeting is with so it’s selected the appropriate clothes for you. Based on traffic reports, your phone tells you exactly what time you need to leave to get to your appointment on time. Your car has had a say in the matter, noting that you need to make sure you stop for gas on the way home.
Not too long ago there was a time when people would sit in rooms and debate the benefits of system virtualization. Go back a few more years and people would hotly debate the benefits of storage networking. I remember people being vehemently against NAS and SAN technologies. If you were to debate system and storage virtualization today with the same vigor that I saw early on in their life cycles you might be laughed out of a room.
While it sometimes mirrors traditional BI, data mining differs in its ability to dig deep into a quarry of information. With the right tools, the process can yield valuable and intricate intelligence. As we continue to analyze the BI Maturity Curve with this blog, we delve into data mining.
We have certainly seen a lot of industry buzz centered on the improved efficiencies and cost savings afforded by a cloud-enabled converged infrastructure in the data center. While there is no question that such a solution can reduce the cost and complexity of deploying your organization’s cloud strategy, there exists a potential gremlin that can quickly rob you of any savings in cost and labor if not properly managed.
On the first day I started working with AWS (Amazon Web Services), I provisioned a few EC2 (Elastic Compute) Linux instances and thought I was well on my way to being a cloud expert. Was I ever in for a surprise! I set up an FTP server, a web server, and a few other options and was just playing around in the environment. I felt pretty good by the end of the day. I shut down my instances to save costs and wrapped things up for the evening.
I have always enjoyed attending Oracle OpenWorld in San Francisco. Given the wide breadth of solutions that Oracle offers, the event has given me great insight into industry trends and what is happening in the global economy. It is also easy to see what is hot with data management, analytics and industry specific solutions. This year will be even more fulfilling since I have been selected to speak at the conference.
In order to help a business successfully hit its full potential, the ideal Enterprise Performance Management (EPM) platform meets the needs of financial reporting/planning as well as information technology demands. This blog examines each of those interests.
This sounded like a good thing to hear, but unfortunately it was the worst thing to hear. Why would a good backup not be recoverable? Why would hearing that all of the backups were successful be a bad thing? It was a bad thing to hear because it confirmed that the data was lost. Now hold on, you might be thinking, “Why did that confirm a data loss? How can good backups be bad?”
When a business looks to unlock potential, Enterprise Performance Management (EPM) is a lead platform. It addresses the many challenges businesses face, from achieving profitable growth to delivering consistent performance and standardized processes.
Welcome to the first entry of my new bi-weekly blog where I’ll be sharing my experiences here at Corporate Technologies as we move our customers to the cloud. I’ll be tackling specific items using real world examples and offering key insights into what’s working for our customers.
NetApp’s Flash Cache has been around for some time. It started off with PAM (Performance Acceleration Module) cards and then Flash Cache, and currently it includes Flash Cache 2, boasting cache sizes up to 2TB per card that can be combined in configurations of up to 16TB of intelligent read cache in a single system.
Continuing with our series, we will further the discussion we started in our initial Operational BI post with a focus on Traditional Business Intelligence. Traditional BI focuses on understanding data obtained from different sources within the enterprise, frequently centering on high-level metrics and KPIs unobtainable through operational BI. The ultimate goal is a single-source-of-truth reporting system covering all useful existing data.
Operational BI helps managers to focus and optimize critical business processes daily. The primary purpose is to access appropriate data in nearly real-time fashion (these days it is even possible to start a discussion about real-time delivery) to be able to make decisions that can influence a positive business outcome.
Not too often does a storage technology come around that gets me excited. Yes, excited. I know what you’re thinking: How can a new storage technology design be exciting? Well, if you run a business that could benefit from non-disruptive operations, proven efficiency, and seamless scalability at the storage layer, then get excited about NetApp’s Clustered Data ONTAP.
Failure is the biggest fear of any applications team or architect charged with deploying an Enterprise BI system; the second biggest fear is success. A well thought-out selection and deployment of a BI system leads to increasing acceptance of the tool and growth in demand, the rate of which sometimes surprises even the most experienced architects and managers.
Data Integration Hub was one of the more interesting announcements at Informatica World. Almost all of us in the integration business have built something like it at one time, but now it is a product built on the best that Informatica has to offer. With the ability to intake almost any kind of data, store it in the Hub and then publish it out for subscriptions, point-to-point integration is now simplified.
Oracle Database 12c has been in development for a few years, and anticipation over its release has been building since Oracle OpenWorld last year, when Larry Ellison made a formal announcement about it. To very little fanfare last Tuesday afternoon, Oracle released the new version of its database, Oracle Database 12c. With 12c’s new features, Oracle has solidified its position as a true player in the cloud computing space.
As soon as possible – possibly even before the BI tool selection has been finalized – the business must be consulted and guided to understand the best approaches to security.
Continuing the list that was begun in Part 1 of this series, these are the personnel who will be needed in Phases Two and Three, and some of the key questions that they will need to answer:
Phase Two – Bracing For The Worst
In 1999, during the height of the tech bubble, I was working for a trading company on Wall Street. With over 1,000 traders on the floor, every microsecond that a trading station was down, the company would lose money. As trading volume increased, a station’s CPU, memory and network utilization would be taxed proportionally.
IW13 proved once again to be an exciting educational experience about the new products being offered and the improvements to existing products. But most exciting of all was listening to customer testimonials about their business problems and how they are solving them by creating new and innovative ways to use the data available to them in their organizations.
This is the second post of a three-part series that explores Oracle Business Intelligence (OBI) certification. In this blog post, I will provide an overview of the Oracle Business Intelligence Foundation Suite 11 Essentials Exam (1Z0-591) along with information that could be helpful in your pursuit of this certification.
This post provides three key security recommendations for installing OBIEE in your enterprise environment. 1. Be serious about password management. It is important to lock down the OBIEE environment and change default passwords after installation. The password for the weblogic user should not be simple to guess or widely distributed.
The term “flash” has been tossed around in the storage industry for years. It seems like everyone has a story around flash, but NetApp approaches flash technology in its storage systems a little differently from other storage vendors. From NetApp’s perspective, flash is all about storage efficiency.
This is the first blog of a three-part series that explores the area of Oracle Business Intelligence (OBI) certification. From my own motivation in pursuing certification to providing resources to assist those who are considering becoming an Oracle Business Intelligence Foundation Suite 11g Certified Implementation Specialist (1Z0-591), my hope is that this series will be helpful as well as perhaps entertaining.
When an IT group considers enterprise-class Business Intelligence (BI) platforms, the first concern is to fulfill the functional requirements of the business. However, the scope of the effort required to deliver that BI functionality to the end users is often underestimated.
Let’s say you’re a business analyst who’s been commissioned to create the specs for a portfolio performance report. Your task is to combine information from the back office database, the middle office trading application and a spreadsheet the portfolio manager has maintained since Excel was introduced. While it could be done with the current toolset, it would take months to create the new data model, a database, and the ETL to load it.
What would happen if a major retailer that you’ve been doing business with for 10 years suddenly had no record of your purchases? Imagine if the VIN on your car didn’t show up in your car manufacturer’s records? What happens when you go online to place an order and the site is down? Are you a loyal customer that waits for the site to come back, or do you move on and buy the same product from another business?
In this post, I’d like to share three best-practice tips about OBIEE visualization. Fortunately, there are numerous tools at your disposal for bringing data to your users in an efficient way. There are books written on the subject of dashboards and reports. However, I hope these simple tips might prove useful to you.
It’s a beautiful Saturday in spring and your child has his first little league game. All week you’ve been looking forward to the game. Now you can’t attend because you’re stuck in the data center. The one hour data recovery project you were supposed to quickly do Friday evening has turned into a full weekend project.
Oracle Exalytics is a high-performance, in-memory analytics machine that is extremely efficient at running analytics and business intelligence applications. It consists of powerful hardware (40 CPUs and 1 TB of RAM) as well as the Oracle Business Intelligence (OBIEE) product suite with several in-memory optimizations, such as TimesTen, that are not available in a standard OBIEE installation.
With most businesses closing out their first quarter, now is the right time to look ahead at upcoming BI trends in order to plan for success in 2013. Here’s what I think will be the most important BI trends to consider moving ahead for the year.
I was shocked to discover that more than 175,000 children worldwide are diagnosed with cancer each year. This is alarming. What is even more alarming is that in the United States, more children die of childhood cancer than HIV/AIDS, asthma, cystic fibrosis, congenital anomalies and diabetes combined.
I suspect that there are more than a few voters in the U.S. who are not completely satisfied with the performance of candidates they helped elect last November. Campaign promises point to a bright future and positive outcomes, but actual performance does not always meet rosy promises.
Predictive analytics is a hot issue in today’s business and information technology world. People in the business intelligence community are both fascinated and fearful at the same time. Fascinated – because the progress in computing technology, such as decreased storage costs and cloud computing enables us to explore great new capabilities.
I have been a car guy all my life, responsible for not only one, but two full frame restorations of classic 1960s Pony cars. During the builds, I took an enormous amount of time and energy researching the various parts and components that I wanted for my tricked-out ride.
This is the mistake that closes the series. It happens frequently in BI implementations, and unfortunately there is always a seemingly good excuse for it, such as the lack of a BI strategy roadmap, budget issues, lack of buy-in, or the political environment. This approach is fatal because the team chooses to ignore certain critical project issues from the start.
What’s the use of a new BI (business intelligence) tool if no one is using it? Imagine being given a shiny new car on your birthday when you have no idea how to drive. It does not matter how powerful the car is; you will not be able to use it until you learn how to drive.
What is Big Data and what does it mean to you? This was one of the hardest terms for me to get my head around, since the word “big” is subjective and needs to be associated with something specific that is clearly defined with comparative elements.
So far, I’ve discussed two of the 5 Common Business Intelligence Project Mistakes: not including operations/infrastructure colleagues on-board early in a new BI process, and not creating a proof-of-concept for your BI project. This post covers the third common BI project mistake.
In my first post on 5 Common Business Intelligence Project Mistakes, I discussed the first common mistake of not bringing operations / infrastructure colleagues on-board early in a new BI process. In this post, I’ll talk about the second most common mistake, provide a specific example, and explain how you can avoid making it.
Some things never change: as long as there have been computer administrators, there have been users who generate data, need support, and have computer-related problems. These challenges have been elevated by today’s fast-paced, dynamic, and evolving IT environments, which are becoming increasingly virtualized.
In this series, I highlight the five most frequent mistakes I have encountered in a Business Intelligence (BI) project, and how to avoid them. Hopefully, you are reading this while still in the planning phase of your implementation. Regardless of where you are in your BI plan, however, these tips can be very useful, since BI strategies are constantly growing and changing with the business.
“Owning and operating enterprise compute and storage equipment is expensive and labor intensive. By using our subscription services you get only the compute and storage you actually need, for far less expense.” I delivered that pitch 30 years ago selling CSC Infonet timesharing services in Manhattan.
Not an easy question at all, but the challenge is yours, and it should be one of interest. Classifying data management is no easy task, but I feel it’s something you must consider. Data management impacts almost every aspect of our business. So what is it?
Your business continuity (BC) plan is only as good as the effort you put into it. In the case of the recent Superstorm Sandy, the ability to recover data and stay up and running was a huge geographic challenge for many businesses.
There is no escaping the buzz about big data. Why is there all this interest in big data now? First, there is a lot more data being generated online today. There’s a greater volume of human-generated data – including new sources like social media, photographs, and email.
As we still clean up from Hurricane Sandy, businesses that were offline for any period of time are asking themselves the same question: “After all the money we invested in Disaster Recovery, how did we still have downtime?” The answer is actually simple.
According to a recent Quantum survey on disaster recovery, a whopping 90% of IT decision makers surveyed felt their data was vulnerable in the event of a disaster. In addition, according to Quantum’s findings, “Twenty-seven percent experienced some form of data security incident in the last year, only 15 percent of which were due to natural disasters.”
In our recent blog entry we described Agile BI and why it is very important for organizations to apply it. In this entry we focus on what makes a successful Agile BI project: proper expectation management. De-mystifying Agile BI means rigorously practicing expectation management.
It always makes me feel good when I can help a client utilize the technology they own to solve an issue. I was recently working with a client to add additional database copies to their Microsoft Exchange 2010 environment. We went through a storage design and provisioned the solution easily enough.
Just mentioning Agile BI feels good. Everybody wants to be agile, and Business Intelligence is the holy grail of business management. Truth be told, Agile BI is understood by some, practiced by few, and desired by many.
The last time I was at Oracle OpenWorld Sun was an exhibitor and sponsor, an iPad was something you got from an optometrist and if you asked somebody what their cloud strategy was they might tell you that they were bringing an umbrella.
Do you want to make smarter business decisions by harnessing the analytical power of new computing technologies? Do you want to deliver on performance SLAs with the LOB despite ever growing data in warehouses and applications? Do you want to use innovative data integration techniques to meet the fast-changing demands of the business?
More and more, the brain trust seems to think that IT is incapable of cost-effectively architecting and managing IT infrastructure. Having been a storage admin for most of my career, I believe in allowing the talented engineers on your staff to architect and deploy IT infrastructure solutions specific to your business goals and objectives.
It’s 6:30pm on a Friday night, and your director, VP, or worse the legal department, needs some specific data from three months ago, and they need it now. How confident are you that you can recover it for them? And how long will it take you?
At the beginning of the first Indiana Jones movie, Raiders of the Lost Ark, Harrison Ford braves an ancient Peruvian temple filled with daunting obstacles. Drawing upon his bravery and ingenuity, he is able to retrieve a biblical artifact. He thinks he’s home free, but unfortunately a huge boulder breaks loose and rolls toward him, and he has to flee the cave into the waiting arms of one of his competitors and a hundred poison-dart-shooting indigenous Hovitos.
Welcome to Corporate Technologies’ blog. We will be sharing content and ideas with you around central themes – Planning, Preparation and Innovation – and we hope that you’ll share your thoughts with us. Looking forward to seeing you back here often.