WELCOME TO KEATON CONSULTING.
★★★★★
“An invaluable partner…”
Keaton Consulting has been an instrumental partner across multiple disciplines. Their expertise in Performance Test Engineering, Tooling and Test Automation, along with their adaptability, has enabled us to deliver on numerous engagements of varying scope, size and subject. Their commitment to excellence and their collaborative approach make them an invaluable partner.
We Illuminate I.T.
We provide IT services specializing in software test automation and performance engineering, along with strategic insights into your organization's people, processes and technologies for effective IT governance.
★★★★★
“A joy to work with…”
I have worked with Keaton Consulting for over 2 years doing performance testing for our most critical systems.
They bring great knowledge and expertise to the table and are a joy to work with.
Client Stories
Dodd-Frank and the US Congress
The Dodd-Frank legislation targeted banks, mortgage lenders and credit rating agencies deemed 'too big to fail' (SIFIs, or systemically important financial institutions), mandating compliance within 2 years of its passing. The KC customer had to split ledger line-item reporting as part of the enhanced public disclosure component; as a result, millions of consumer statements tripled in size. KC was brought in to evaluate whether the system could maintain performance expectations with the increased size of data objects. System failure would not only have impacted the customer experience but would also have created a significant federal compliance issue.
The KC team built and executed a comprehensive performance test plan, collaborated with technical teams to tune systems to handle the new load profile, and produced formal results that were included in the compliance report submitted to the US Congress and archived.
IT Leadership – SI Governance
A large nonprofit whose primary fundraising activity is the sale of baked goods is highly dependent upon a digital platform to manage communication between the volunteers, members, and bakers involved. They contracted out the development, management and maintenance of their cloud-based system to a system integrator under an agreement that included a commitment to a DevOps delivery plan with extensive unit tests as part of their continuous quality improvement mandate. During executive readouts and numerous proof-of-concept demonstrations, the subcontracted team could not show any metrics to illustrate progress.
The KC team integrated sophisticated BI analytic solutions into multiple IT delivery systems, including Jira, SonarQube, Git, and Jenkins, to convey insights into the efficiency and effectiveness of the DevOps teams. The solution revealed over 1,100 unit tests that were executed as part of the pipeline build, but the test failures were not being surfaced to the DevOps team, revealing a concerning process breakdown. The teams quickly enhanced their process to investigate regression test failures and increased system reliability. A simple dashboard relating contracted deliverables to the actual work delivered and the quality promised gave leadership visual confirmation of proof of work and allowed the relationship to scale.
Keeping a Cruise Line Afloat
A large cruise line was investigating why their online booking system was not being adopted by end customers as quickly as expected and hired the KC team to perform a load test. Journey maps and click streams were developed to identify the "happy path" use cases to automate for the test, along with production metrics for the checkout funnel.
The KC team identified 2 major end-user experience issues:
When searching for a cruise, a user had to select a departure port or cruise ship along with a day of departure before seeing the results of a search. Only after executing the search was the customer made aware that a cabin preference or an entire cruise was sold out, forcing the customer to go back and select a new cruise to search on. An analysis of production data found that over 50% of search users would drop out of a search session after 2 or 3 search attempts that ended in a "sold out" result.
When attempting to book a cruise, all fields on multiple forms had to be filled out for each individual reservation prior to selecting a dinner seating preference. When the dinner seating preference was not available, the system required the user to re-enter all of the data. As a result, only 10% of all bookings were actually completed.
The KC team reported the findings to the business, recommending a workflow redesign. As a result, the cruise line was able to increase the number of direct bookings, saving millions of dollars by avoiding the 10% fee paid to travel agents.
No Limits to Business Growth
A large online prescription fulfillment company was excited about projected exponential year-over-year market growth but was concerned that their order and fulfillment systems would not be able to handle the projected load. They hired the KC team to build and execute load tests against the system to identify the maximum capacity before failure; when normalized against projected growth rates, failure was forecast within 18 months.
The customer then asked the KC team to test 2 system configurations at peak forecast capacity in order to determine the most cost-effective infrastructure option to handle larger scale. The KC team not only found a $2 million difference in the proposed solutions but also identified that the capacity issue would not be addressed by either infrastructure solution. Instead, the issue was code related and required a full-scale logic redesign.
The organization was able to save the infrastructure budget and reallocate the funds to a reconstructed codebase, which would be delivered before the 18-month runway expired.
Black Friday is GO!
A children’s clothing customer was preparing for peak season and requested a performance test for their eCommerce site, which had recently been moved to the cloud. They had run aggressive Black Friday campaigns online and in-store promoting coupon codes that would be valid for “one day only”. Even with hyper-scaling in place, they were concerned that this “lifted and shifted” system would not handle the expected traffic of 125% of the previous year’s peak-season load.
The KC team designed and executed multiple test scenarios and found infrastructure, database, and configuration issues, including uncompressed product images, resources that blocked parallel downloading, and undefined server keep-alive timeouts. The iterative nature of this engagement allowed the development team to reconfigure the system and the KC team to rerun the tests until the system was fine-tuned to handle 380% of the projected peak load before reaching an error state. As a result, the system seamlessly handled all Black Friday traffic with no outages.
More Telephone Poles, Please!
An energy company servicing the Mid-Atlantic states was preparing for a federal audit on disaster recovery procedures, which included their software systems. In order to be as realistic as possible, the teams selected a storm that had occurred 20 years earlier (which included a devastating tornado) as the environmental scenario to simulate. Analysis of that storm determined that, on average, customers had 3 methods of communication: cellphone, internet and landline. Some communications infrastructure was taken out during the storm, resulting in communication challenges at the time; the team was concerned that the problem would be exacerbated in a repeat event because of the large number of new neighborhoods built in the past decade.
The KC team defined the load profile by modeling registered addresses, estimated area population, and the percentage of accounts impacted by the disaster. During scenario execution, the team discovered that a software enhancement, which automatically ordered a new telephone pole when service calls came in from registered service addresses, would push the system into an error state if requests exceeded inventory. As a result of the error, customers would be unable to receive disaster outage updates. The software was fixed to accommodate the unexpected constraint, and the energy company passed the audit.
Accelerating High Quality
A large database software company launched an enterprise initiative to accelerate system testing for customized commercial off-the-shelf (COTS) software. Executing the initiative required training 300 employees (mostly business analysts and manual testers) on both the new methodology and the functional test automation technology. Because skillsets varied significantly across the team, standard training material on the technology was ineffective, and a customized program needed to be developed.
The KC team designed, developed and delivered 3 separate courses, each containing over 20 hours of lectures, demonstrations and labs that were optimized to provide usable content for actual test automation activities depending upon the student’s familiarity with programming. One key concept of the courses was how to assess which tests were easy to automate and which were more challenging. For the difficult tests, employees were trained to calculate the ROI of automated test development and delivery time versus current manual test execution time, normalized over 1 year. As a result of the training, the company was able to successfully implement the initiative and hit their automation KPIs of both improving quality and accelerating software releases.
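The ROI comparison described above can be sketched in a few lines. This is a minimal illustration of the general idea, not KC course material; the function name and all effort figures below are hypothetical examples.

```python
# Hypothetical sketch of the automation ROI concept: compare the cost of
# automating a test (development plus per-run upkeep) against the cost of
# running it manually, normalized over one year. All numbers are assumptions.

def automation_roi(dev_hours, maintenance_hours_per_run,
                   manual_hours_per_run, runs_per_year):
    """Return first-year ROI: (manual cost avoided - automation cost) / automation cost."""
    automation_cost = dev_hours + maintenance_hours_per_run * runs_per_year
    manual_cost = manual_hours_per_run * runs_per_year
    return (manual_cost - automation_cost) / automation_cost

# Example: a regression test run weekly (52 times a year) that takes 2 hours
# manually, 20 hours to automate, and 15 minutes of upkeep per automated run.
roi = automation_roi(dev_hours=20, maintenance_hours_per_run=0.25,
                     manual_hours_per_run=2, runs_per_year=52)
print(f"First-year ROI: {roi:.0%}")
```

A negative result flags a test that is cheaper to keep manual over the first year, which is exactly the triage decision the courses taught students to make.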
Modernizing Quality for a Mission-Critical Application
A customer's legacy system, the cornerstone technology in delivering value to their clients, had become unmanageable after several evolutions across multiple decades. Born on the mainframe, the software was transformed into a desktop-based client-server application and was now transitioning to a web-based application that would support direct customer interaction. Competitive pressure was driving the shift, and the business required a fully functioning, high-quality application to be produced as quickly as possible.
The KC team was brought in to establish a robust quality practice, define and measure quality metrics, and build the functional test automation assets. The team began by clearly defining a sophisticated requirements framework in terms of KPIs/business objectives, GRC (Governance, Risk & Compliance), personas/use cases, data requirements, and system requirements. The requirements system was then integrated with an automated test bed to report both test coverage and pass/fail results. By leveraging Agile and DevOps methodologies, the program was able to deliver an audit-compliant minimum viable product (MVP) that met the most critical needs of the customers.
★★★★★
“A terrific partner…”
Keaton has been a terrific partner delivering technology solutions to our clients. They focus on the fundamentals and bring real expertise, but what really sets them apart is their ability to identify patterns and provide actionable insights through smart analytics.
They’re easy to work with, responsive, and understand our goals. Plus, their pricing is fair and transparent. If you need a reliable partner who delivers results without any hassle, I highly recommend them.
How We Can Help You
★★★★★
“truly understood DevOps…”
Keaton was the only partner I worked with that truly understood DevOps and its impact on Quality. They were a critical source of feedback regarding Octane and Value Edge from a reporting, analytics, and functionality perspective.
Let's
We know what it takes to be successful. We strive to deliver value-added insights, not simply to check boxes on an activity list. We listen, we collaborate with our customers, and we deliver.