Automation Streamlines Cost Center Management

Challenge

Our client’s finance department tracked spending and budget authority using multiple spreadsheets. The existing approval process for cost center authority was cumbersome, requiring many steps. Historical data was compiled separately and not easily tracked, and all approval data had to be entered manually. The problem was exacerbated by annual reorganizations, which forced updates to the approval data. The resulting approval tracking process was time consuming and prone to security, legal, and financial risk.

Our client needed a better way to track budget approval authority at a cost center level, showing who has the power to release funds for each cost center, with the flexibility to update the system easily when organizational cost centers changed.

Plaster Group was asked to provide a technical solution to mitigate risk, increase approval tracking efficiency and quality, and implement the business processes to support it.

Approach

Plaster Group worked with our client to understand the complex system of authorization and cost center structure. This insight would be used to design and implement a streamlined process using existing SharePoint and InfoPath technologies. To reduce complexity and the chance of error, the solution would require effective automation and a single source for tracking this sensitive data.

Solution

Plaster Group consultants used InfoPath to build a Delegation of Authority form for spending organization funds. Key features include:

  • Tracking accomplished in one SharePoint location with a single InfoPath form
  • Logic within the InfoPath form consolidates the multiple spreadsheets, presenting different options based upon choices made within the form
  • Workflows notify the appropriate authorities when a form is submitted and as the approval process progresses
  • Supporting attachments, such as the approval signature on file, may be affixed to the form

Results

Our client replaced a manual process with a budget authority tracking system that is customized, flexible, and automated. In addition:

  • By using existing technology to develop the custom solution, our client saved money
  • Spending authority and the history of staff authorized for each cost center are tracked in one list
  • Historical changes in finance authority are automatically tracked and may be reported as needed
  • Multiple-step workflows manage the authorization process, with status viewable at the SharePoint site, reducing delays and human error
  • Updates to the form and workflow process may be made as needed to reflect our client’s current cost center and policy structure

Predictive Analytics: An Overview

by Plaster Group’s Data & Analytics Team

Meet Predictive Analytics

  • “The Dow Jones Industrial Average will rise above 20,000 by the end of 2014.”
  • “Our Year over Year Revenue will be 15% higher in 2012 than 2011.”
  • “The San Francisco Giants will win the World Series in 2014 and 2016.”

Which of the above predictions seems the most implausible? The first has appeared on major financial news websites, with little to back the assumption that the then-current rate of increase in the Dow Jones value would continue throughout the year. The second was actually the basis for a major retail company’s planning and budgeting, even though the previous two years of recession should have suggested a more modest projection. The last may seem silly, but it may turn out to be as accurate as the other two, and it is at least based on past performance.

Predictive Analytics is the science of using historical data, analyzed through descriptive statistics, to develop a model (or algorithm) for making predictions about the future state of a given topic. Whether or not your company makes calculated predictions based on rational analysis of its performance, its customers’ behavior, or conditions in the marketplace, it is still making decisions based on predictions; they just may not be scientific and probabilistic. Whenever you order future inventory, create a hiring and training plan, or plan to expand and open new stores, you are predicting that business will continue to grow. However, if you are simply ‘guesstimating’ or making a gut call, and you are significantly off, you may leave potential revenue ‘on the table’ or incur unnecessary costs.
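To make the idea concrete, here is a minimal sketch in Python (using scikit-learn) that fits a linear trend to a few years of historical annual revenue and projects the next year. The figures are invented purely for illustration.

    # Minimal predictive-analytics sketch: fit a linear trend to historical
    # annual revenue (hypothetical figures) and project the next year.
    import numpy as np
    from sklearn.linear_model import LinearRegression

    years = np.array([[2008], [2009], [2010], [2011]])  # model input
    revenue = np.array([4.1, 3.6, 3.7, 3.9])            # $M, illustrative only

    model = LinearRegression().fit(years, revenue)
    forecast = model.predict(np.array([[2012]]))[0]
    print(f"Projected 2012 revenue: ${forecast:.2f}M")

A real model would be validated against held-out data before anyone budgeted against it; the point is only that the prediction comes from a fitted model rather than a gut call.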

Techniques

Predictive Analytics is part of the broader discipline of “Data Science” (the definition of which is still being actively debated). Data Science can include data discovery, data management, descriptive statistics and probability, machine learning, pattern recognition, data visualization and other disciplines. But predictive analytics attempts to utilize these techniques to develop models for determining the probabilities of various outcomes given a certain set of input variables. Some common applications of predictive analytics include:

  • Marketing and Sales departments analyzing customer segmentation for targeting of direct marketing campaigns, cross-selling and up-selling and customer retention efforts
  • Operations and supply chain teams forecasting inventory levels, resource allocation, distribution models, etc.
  • Economic forecasting, risk analysis and fraud detection, employed extensively in financial services

The most common and familiar predictive analytics techniques are:

  • linear regression
  • multivariate regression
  • correlation and cluster analysis
  • nearest neighbor analysis
  • time series analysis

Some more advanced techniques gaining popularity are:

  • network analysis
  • market basket (or affinity) analysis
  • geospatial distribution modeling

While it is not necessary for a data analyst or data scientist to know the intricate details of the mathematics of the methods they apply, they should be well aware of the proper application of these methodologies and the pitfalls of their misuse.
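To ground one of these techniques, the sketch below (Python with scikit-learn) performs a simple cluster analysis, segmenting customers by annual spend and visit frequency. The data points and cluster count are hypothetical.

    # Cluster-analysis sketch: segment customers by annual spend and visit
    # frequency using k-means. All data and the cluster count are invented.
    import numpy as np
    from sklearn.cluster import KMeans

    # Columns: annual spend ($), visits per year -- illustrative only.
    customers = np.array([
        [120,  2], [150,  3], [900, 25], [950, 30],
        [480, 10], [510, 12], [130,  4], [880, 28],
    ])

    kmeans = KMeans(n_clusters=3, n_init=10, random_state=0).fit(customers)
    for label, (spend, visits) in zip(kmeans.labels_, customers):
        print(f"segment {label}: spend=${spend}, visits={visits}")

In practice the features would be scaled first and the cluster count chosen by inspection or a criterion such as the elbow method; skipping either step is exactly the kind of misapplication noted above.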

Enter Big Data

The field of predictive analytics is converging with the advent of “Big Data” to provide companies with unique insights into customer behavior, at price points and levels of system availability never seen before. Companies now have the ability to analyze web traffic and mobile application data, as well as billions of traditional point-of-sale transactions, in volumes and with methods that were out of the reach of most IT departments in the not too distant past. However, the business stakeholders who request these analyses and the data scientists who perform them should keep in mind the diminishing returns of processing millions or billions of additional records: confidence intervals narrow only slowly as record counts grow. It is quite possible that a much smaller randomly selected sample would yield adequately reliable results.
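That diminishing return follows from basic sampling arithmetic: the width of a confidence interval for a mean shrinks with the square root of the sample size, so a thousand-fold increase in records narrows the interval only about 32-fold. A quick Python illustration (assuming, for simplicity, a known population standard deviation):

    # A 95% confidence interval for a mean has half-width 1.96*sigma/sqrt(n),
    # so each 1000x increase in n narrows it only ~32x.
    import math

    sigma = 1.0  # assumed population standard deviation (illustrative)
    for n in (10_000, 10_000_000, 10_000_000_000):
        half_width = 1.96 * sigma / math.sqrt(n)
        print(f"n={n:>14,}: 95% CI half-width = {half_width:.7f}")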

The technologies that comprise ‘Machine Learning’ leverage the results of predictive analytics as input to develop predictive models. Systems can then accept changing variables as parameters to launch automated programs with actions prescribed by the predictive model. A simple application might be a company’s inventory controls automatically reordering items that are projected to run low, or a manufacturing management system shifting production from an over-booked plant to a less utilized one when conditions warrant. Monitoring system variables can provide feedback to the predictive model to refine its accuracy.
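A hedged sketch of the inventory example (Python; the forecast function is a stand-in for a real trained model, and every quantity is invented):

    # Automated reorder sketch: a predictive model's demand forecast drives
    # the reorder decision. forecast_demand() stands in for a trained model.
    def forecast_demand(item: str, days: int) -> float:
        return 12.5 * days  # stand-in: a real system would query its model

    def maybe_reorder(item: str, on_hand: int, lead_time_days: int,
                      safety_stock: int) -> int:
        expected_use = forecast_demand(item, lead_time_days)
        projected = on_hand - expected_use
        if projected < safety_stock:  # model predicts we run too low
            order_qty = int(safety_stock + expected_use - on_hand)
            print(f"Reordering {order_qty} units of {item}")
            return order_qty
        return 0

    maybe_reorder("widget-a", on_hand=150, lead_time_days=7, safety_stock=80)

Actual consumption observed over the lead time would then be fed back to retrain the forecast, closing the feedback loop described above.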

The application of machine learning programs in automated commerce is increasing in speed and consequences every day. The May 6, 2010 “Flash Crash” of the New York Stock Exchange is a famous example of automated systems reacting to algorithms based on predictive models. On that day, at around 2:40pm Eastern time, the Dow Jones Industrial Average plunged 600 points within 5 minutes, only to recover within 20 minutes. The huge spikes in trading volumes observed at the beginning and end of every trading day are caused by automated trading systems placing large volumes of orders based on trading algorithms that are based on machine learning behavior.

Tools and ERPs

The toolbox for employing predictive analytics has grown rapidly in the last few years. The marketplace for these tools has expanded beyond the traditional industry leaders such as SAS, Dell StatSoft, IBM SPSS, and Minitab.

By adding advanced analytics modules to existing business intelligence packages such as Microsoft SQL Server Analysis Services, Oracle Data Mining, MicroStrategy Data Mining, or SAP Predictive Analysis Library, these vendors now offer incredibly powerful desktop capabilities to any data scientist, data analyst, or business analyst. Extremely powerful open source tools also exist for predictive analytics, notably R and KNIME.

In Conclusion

Companies that seek to augment their existing business intelligence reporting capabilities with predictive analytics should evaluate their existing data management processes to ensure they are analyzing good, high-quality data. In a high-functioning organization, the insights discovered by the data science team can provide input and guidance to the data management team to improve overall data quality, which in turn leads to better analysis and insight.

If you and your company would like to leverage the power of advanced analytics methodologies, Plaster Group Data & Analytics Consulting offers expert guidance. Our team of seasoned professionals offers expertise in the evaluation, design, and implementation of predictive analytics, machine learning, Big Data analytics, and more, helping you extract insight and business value from your ever-increasing data assets.

Why You’re Ready For Microsoft Azure (even if you think you’re not)


by Plaster Group’s Enterprise Software Team

The Cloud is Important

I’m going to skip the opening paragraph that explains why the cloud is Important. You already know.

The question is: how are you going to adopt the cloud in your solution portfolio? What’s the best way not to just use the cloud, but get massive competitive advantage out of it?

Azure is Microsoft’s cloud offering, and in this post I want to offer some perspectives you can take away on Azure and how it can fit within your existing infrastructure. I want to talk about some common, limiting perspectives on why it might seem difficult to move to Azure, and see why they’re not the scary monsters they might seem to be. I hope to show you that you’re ready to start moving to Azure as soon as you’re done reading this article.

Why Microsoft Azure?

First, I want to share at a high level my view of Azure and how it compares with Amazon’s AWS. AWS is the market leader in cloud computing, and many startups today use it. AWS is a great platform, with extensive infrastructure and platform services that let you build applications at web scale, including EC2 for compute, RDS for relational databases, DynamoDB for NoSQL, Auto Scaling, and dozens of other features. I’m a big fan of AWS, and Amazon continues to improve it.

So why would a company choose Microsoft Azure? What does Azure offer that AWS doesn’t?

They both offer Infrastructure-as-a-Service, with both Windows and Linux machines. They both offer high-speed, nearly unlimited-scale NoSQL, BLOB, and SQL storage. They both have rich management APIs to enable automation of every aspect of operations.

With that said, Azure also has all of Microsoft’s experience in how their products already work for you. They didn’t throw all of that away as they’ve grown and shaped their cloud offering. If you’re in a Windows-centric enterprise, Azure has been shaped to make your path to the cloud as smooth as possible. It’s made to be able to handle as much or as little of your infrastructure as you’re comfortable with, with little friction. If you’re a developer – web or desktop – Microsoft is working hard to give you a flexible architecture across on-premises and cloud deployments. And that’s why I recommend Azure over AWS for Windows-centric environments. Not because AWS isn’t great – it is – but because there’s real benefit for a Windows environment to use Azure over AWS.

(And if you’re in a Linux environment, don’t get me wrong, it’s awesome for you, too.)

Taking New Perspectives

Let’s look at some of the common concerns about moving to Microsoft Azure, and how we might address them or see them differently.

I don’t want my Active Directory outside of my firewall.
I’ll give two perspectives on this. First, the inevitability. Second, the benefits, since, really, it’s an enormously Good Thing when you see what it gets you.

First, it’s fairly inevitable that you’re going to sync your Active Directory to Microsoft’s cloud. Why? When you use Office 365, and want to integrate it into your environment in any meaningful way, you’ll have to sync your AD to Office 365, and Office 365 uses Azure Active Directory. If you have Office 365, then you already have a tenant in Azure Active Directory (AAD). (If you don’t, I’d highly recommend taking a look at it for your enterprise, and I’ll tell you why in a different post.)

Second, even if you have no plans to adopt Office 365, there’s enormous benefit to AAD. Microsoft gives you free identity federation with hundreds of common SaaS applications. Your users get a single sign-on experience, and you get control over your SaaS user accounts, so you know exactly how many users you’re paying for on Workday, Trello, LucidChart, and many, many others.

OK, but I’m definitely not syncing my passwords to the cloud.
Well, there are benefits for your users if you do, but the fundamental perspective I want to offer is that Microsoft is exceptionally good at security now. The bad old days of Windows XP and Slammer are well over. When you sync your passwords with the rest of your AD information, Microsoft re-encrypts the already-hashed password, sends it over TLS (so the protected value is encrypted again in transit), and then stores it in Azure Active Directory with both layers of protection in place. It’s as secure as they could make it.
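As a purely conceptual sketch of that layering (Python; this illustrates hash-then-rehash plus transport encryption in general, not Microsoft’s actual algorithms or protocol):

    # Conceptual layering sketch: the directory stores a hash, the sync
    # re-hashes it with a fresh salt, and TLS protects it in transit.
    # Illustrative only -- not Microsoft's actual implementation.
    import hashlib, os

    password = "correct horse battery staple"
    stored_hash = hashlib.sha256(password.encode()).digest()  # what the directory holds

    salt = os.urandom(16)  # fresh salt for this sync cycle
    synced_value = hashlib.sha256(salt + stored_hash).hexdigest()

    # (salt, synced_value) would then travel over a TLS channel; the
    # clear-text password is never transmitted or stored in recoverable form.
    print(synced_value)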

When you use DirSync (Microsoft’s on-premises-to-cloud AD synchronization tool) to sync your user accounts and passwords, it means that you no longer have to run authentication platforms like Active Directory Federation Services in your infrastructure. Azure Active Directory can handle all of your authentication needs.

Don’t forget: Microsoft runs Bing, MSN, Office 365, Azure, and Microsoft.com (itself a Top 50 site worldwide), among many, many other properties. They’ve got this whole IT and security thing figured out by now.

I’ll have to drill a truck-sized hole in my firewall to make it work.
Not at all. Azure Virtual Networks allows you to create a site-to-site VPN with Microsoft’s data centers using your existing firewall. You define the IP addresses you want on virtual machines in Microsoft Azure to match your internal network’s addresses. You can even use your existing DNS servers. Everything works together just as if it’s one extended data center.

I don’t trust cloud performance.
I’ve been in this business for over twenty years, and one thing that has remained the same is that system engineers always ask for way bigger hardware than they really need, since they don’t want to be accused of sizing too small. Because of that, a lot of servers sit around running under 10% CPU and under 50% memory usage, draining more electricity than they really need. Moving your workloads, whether they’re already virtual or (especially) still on physical servers, is an opportunity to right-size your Azure virtual machines based on their actual needs and usage patterns, not on the timeless tradition of over-specifying hardware. (Not that I’ve ever done that.) It turns out you can get a lot done even on small and medium instances, and the cost savings can be tremendous over running your own servers.
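Here’s a sketch of what right-sizing can look like in practice (Python): pick the smallest instance whose capacity covers observed peak usage plus headroom. The instance catalog and all figures are hypothetical stand-ins, not Azure’s actual size list.

    # Right-sizing sketch: choose the smallest instance that covers observed
    # peak usage plus headroom. Catalog and figures are hypothetical.
    SIZES = [  # (name, cores, memory_gb) -- illustrative stand-ins
        ("small", 1, 1.75), ("medium", 2, 3.5),
        ("large", 4, 7.0), ("xlarge", 8, 14.0),
    ]

    def right_size(peak_cores: float, peak_mem_gb: float,
                   headroom: float = 1.3) -> str:
        need_cores = peak_cores * headroom
        need_mem = peak_mem_gb * headroom
        for name, cores, mem in SIZES:
            if cores >= need_cores and mem >= need_mem:
                return name
        return SIZES[-1][0]  # biggest available if nothing smaller fits

    # A server "sized" at 8 cores that actually peaks at 0.8 cores / 2 GB:
    print(right_size(peak_cores=0.8, peak_mem_gb=2.0))  # -> medium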

And if you need entire servers to yourself, Microsoft Azure offers instances with massive scalability, massive memory, and enormous local storage. It’s up to you.

Where it’s possible, using auto-scale is a great way to make sure your performance is consistently great at the lowest possible cost. Whether you’re using IaaS or PaaS, auto-scaling your instances is the best way to use the fewest resources while giving all of your users and customers a great experience.
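At its core, any auto-scale policy is a feedback rule of roughly this shape (conceptual Python sketch; the thresholds and bounds are illustrative, and this is not Azure’s auto-scale API):

    # Conceptual auto-scale rule: add instances under sustained load, remove
    # them when capacity sits idle, always within fixed bounds.
    def desired_instances(current: int, avg_cpu_pct: float,
                          min_n: int = 2, max_n: int = 10) -> int:
        if avg_cpu_pct > 75:                 # scale out under load
            return min(current + 1, max_n)
        if avg_cpu_pct < 25:                 # scale in when idle
            return max(current - 1, min_n)
        return current                       # steady state

    print(desired_instances(current=3, avg_cpu_pct=82))  # -> 4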

Migration is hard.
If you’re already running System Center, then all you have to do is fire up the Azure Management Pack, and you’re connected to your Azure infrastructure. Combine that with Virtual Machine Manager, and System Center becomes a console that allows you to move virtual workloads back and forth easily.

You’re Ready

We’ve looked at Azure and AWS. We’ve looked at a few myths about being ready for the cloud, and offered some perspectives that answer those challenges.

Now, you’re ready to get started.

Plaster Group Cloud Computing consulting can help you navigate your move to the cloud. We’ll help you create a roadmap for your migration, and we’ll help you execute on it, saving you time and money while opening up new levels of flexibility and opportunity for your IT infrastructure to serve your enterprise.

Save money, increase flexibility, serve your organization: that’s what the cloud is all about.

HRIS Implementation of Role and Compensation Structures

New compensation structure provides clarity of roles and equity of jobs across the organization, aligned with the new total compensation philosophy.

Challenge

Due in part to our client’s rapid growth in recent years, there were inconsistencies in compensation and job levels across the organization. To resolve this issue, and to establish more accurate compensation models for all employees, our client began a process of reviewing every role in the organization to determine the typical compensation ranges for those roles, as compared to the external marketplace. The goal of this initiative was to enable every manager to have a confident, candid conversation with each of his or her employees regarding that employee’s title, salary level, and salary, in such a way that the employee clearly understands the reasoning behind each.

Approach

Plaster Group’s Enterprise Software Team was approached to manage the IT components of this complex, enterprise initiative. The time frame for system implementation was short, constrained by the overall rhythm of business for the organization. The approach required defining comprehensive, end-to-end business processes that spanned the enterprise and then configuring these processes into the organization’s Workday HR platform. Managing and coordinating internal and third-party resources across the organization to a comprehensive plan was required to complete this effort with zero compensation-related post-production defects.

Solution

Our solution leveraged enterprise systems the customer already had in-house, augmenting those resources with specific third-party expertise necessary to complete the project.

  • Automated data conversion with triple check for accuracy and completeness
  • Detailed cross-organization checklist, ensuring 100% accuracy
  • Stand-alone role library, including both a private ‘work area’ and a public view, architected to integrate with multiple enterprise SaaS platforms

Results

The results provided the organization with a new compensation structure which included a library of new role descriptions, internal classification and grading of roles and robust external benchmarking to reputable market salary surveys. Employee self-service information was enabled in the Workday platform.

Technologies