Product Design & Development

Distill What You See Into Action

On Monday morning, you receive a phone call from a Senior Partner in your company. A client of your company is preparing to go live with a new software system in sixty days. The Senior Partner, knowing you have extensive, successful experience delivering technology systems to the marketplace that meet or exceed functional expectations, are secure-by-design, delight customers, generate revenue, and increase brand value, wants confidence that everything will happen as planned and desired. You can name your price, but it doesn’t change the fact that you have three weeks to learn, assess, refine, and present observations and recommendations. You have no idea how deep the rabbit hole may go. After some back and forth in the conversation, you accept.

You’ve done this before. You know what to do. You have a battle-tested framework for assessing large volumes of data in short periods of time, determining planned-versus-actual deltas, and building probability-weighted remediation plans. This framework, developed over years and myriad assessment and salvage operations, helps you identify not only what, why, and when, but also what not. In other words, while you’ve found success making observations and recommendations, the real magic has been identifying what doesn’t matter, what doesn’t need to be addressed, and what can safely be ignored for now, and perhaps forever.

Identifying what does not matter is often harder than identifying what does.

The assessment framework you use for these types of engagements is structured to help you discover project health and corporate risk while eliminating noise. After all, folks pay you to figure out the state and health of their investment in short, thorough engagements. And what they expect is a concise list of observations and recommendations that guide immediate decisions leading to crystallized outcomes.

You know there will be one of three possible outcomes:

  • Outcome One: This project is working well. You may recommend some changes here and there to fine tune the performance, but overall, the effort is heading in a good direction and should render the desired outcomes according to the expected parameters.
  • Outcome Two: This project is not working well, but is correctable. You recommend a number of changes that will bring the project back into expected performance. You additionally recommend some key areas (health indicators) of the project to keep an eye on from today through the end of the project to minimize the possibility of ending up here again.
  • Outcome Three: This project is not working well and is not recoverable in a manner that makes financial sense. Your recommendation will likely be to close the project down, perform a retrospective, and use the output to influence project structures and decisions in the future.

From experience, you also know that some projects classified as Outcome Two should really have been classified as Outcome Three. However, after you reported your results, the Senior Leadership team was not yet willing to accept the possibility of a sunk-cost effort and instead chose to believe that doubling down would pull the project back into the green zone. (Where “doubling down” = “get more people, work harder, spend more money”.)

The Assessment Framework

For each category below, research and understand what exists, what doesn’t exist, in what state it exists, and what must, could, should, or will not be done accordingly.

01 Problem Statement

What problem does this organization need solved? Would you categorize this as a business, technical, security, compliance, team member efficacy, or client-driven problem? What is the known/perceived blast radius of this problem statement impact — the industry, your target market, your enterprise, or localized within an enterprise?

02 Desired Outcomes

What change must be realized as a result of this effort (expenditure)? What will this/these change(s) look like to the affected parties? Will they care and why?

03 Definition of Done

How does this effort, team, project, or program know when it is done spending money?

04 Constraints/Attributes

Are there parameters, boundaries, and/or attributes to which this engagement must adhere, or for which it must otherwise evidence compliance? Examples: Financial (SOC), Health (HIPAA), Security (NIST), Privacy (CCPA), System Availability (5NINES), budget, time, capacity, risk appetite, traceability, auditability, etc.

05 Dependencies

What dependencies exist that may complicate, inhibit, or otherwise preclude this effort from successful completion? For example, will this solution sell itself into an existing market or do we need to create market demand along with introducing a solution? Are we innovating on existing things or green-field inventing? Do we understand the target market? Are teams skilled correctly?

06 Team

What roles were initially requested to make this effort happen successfully? How has this changed through the course of the effort? What exists today? What should exist? Is the team keeping things simple enough? Are they thinking big enough? How frequently, if at all, is the team given the opportunity to assert, test, learn, and change? Is this an adaptive or inflexible project environment?

07 Work

How is work (deliverable) discovered, defined, prioritized, and realized? Is there more than one backlog? Is there more than one priority? Who are the stakeholders? How are they involved? What is the definition of done? What implementation choices have been made and how will they impact long-term solution viability, cost of ownership, staffing availability, training, and competency?

08 Money

Did there exist an idea of how much money would need to be spent in order to realize the desired value? If there is money awareness, are there planned versus actual details? A known run-rate? Remaining spend projection?

09 Commitment

What was the original delivery commitment? What is currently delivered? Is there a delta? If yes, why? If there is a planned versus actual delta, will this effort require remediation activities now, later, or never?

10 Risks

What are the risks which may impact successful implementation, daily operations, and customer delight? What is the associative probability of each occurring? What is the associative impact of each occurring? Particularly, though not exclusively, what are the elimination, mitigation, and/or remediation options for each?
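The probability and impact questions above lend themselves to a simple ranked risk register. A minimal sketch in Python (the scoring scale, field names, and example entries are assumptions for illustration, not part of the framework):

```python
from dataclasses import dataclass

@dataclass
class Risk:
    """One entry in a hypothetical risk register (all fields are illustrative)."""
    description: str
    probability: float  # likelihood of occurring, 0.0 to 1.0
    impact: int         # severity if it occurs, 1 (minor) to 5 (critical)
    response: str       # elimination, mitigation, or remediation option

    @property
    def exposure(self) -> float:
        """Probability times impact: a simple score for ranking risks."""
        return self.probability * self.impact

# Invented example entries for a sixty-day go-live.
risks = [
    Risk("Vendor API not ready at go-live", 0.4, 5, "mitigate: build a fallback stub"),
    Risk("Team lacks load-testing experience", 0.7, 3, "remediate: bring in a specialist"),
    Risk("Minor UI polish incomplete", 0.9, 1, "accept: defer past launch"),
]

# Address the highest-exposure risks first.
for r in sorted(risks, key=lambda r: r.exposure, reverse=True):
    print(f"{r.exposure:.1f}  {r.description} -> {r.response}")
```

Ranking by exposure only prioritizes the conversation; deciding among elimination, mitigation, and remediation for each entry still requires judgment.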

To be a healthy, useful, value-driven, and value-realized investment, projects of any size, in organizations of any size, require the elements above in some way, shape, or form.

At the end of this effort, you typically have what you need to ascertain investment-to-return potential and make your recommendations to the Senior Leadership team.


People Operations Focuses on Bringing Even More Value to Team Members

Who you work alongside matters. This is a core belief shared by the people of Trility Consulting®. This shared value plays a role in ensuring they work as a team in delivering the highest priority outcomes to clients.  

Who you stand beside matters. I am grateful for each and every one of the people who have joined and committed to this journey together.

Matthew D Edwards / CEO

This journey has grown to include more contracts, more clients, and more team members, so Trility leadership realized this small startup was ready for a formal People Operations team and announced the following promotions and hires.

Jennifer Davis promoted to Vice President of People Operations.

As one of the first employees hired in March of 2017, Davis has reliably served Trility as its needs arose and grew exponentially. “We worked with Jennifer before Trility as consultants and quickly realized she would always deliver and always with a smile,” said Edwards. “From the start, she has always embodied the spirit of People Operations – caring and serving others and putting people first always. She continues to make Trility a better place for each of us, so she was the natural person to step up and lead People Operations.”

Kori Danner promoted to Talent Delivery Manager.

“Finding new talent is a lifeline for Trility, and Kori has been at the heart of finding people who exemplify what Trility embraces – honor, professionalism, and keeping promises from Day 1,” shared Edwards. “Our ability to scale and not sacrifice how we deliver work is directly related to Kori’s consistency in creating connections and forming reliable processes that earn and increase trust with people considering joining our team.”

Megan Hanna joins the team as Talent Sourcer.

As the newest addition to People Operations, Megan is focused on helping future talent understand Trility’s culture. “Megan has a very natural way of connecting with others and will be integral in helping cultivate Trility’s newest teammates,” Davis shared.

We’re Growing & Hiring

While People Operations is now a formal team at Trility, its directive is not new. These three team members will elevate and scale how Trility creates value for its existing and future team members. As one of the Inc. 5000’s fastest-growing companies, Trility welcomes conversations with people interested in becoming more today than yesterday. View our current openings on LinkedIn or connect with a recruiter.


Creating Value in Relationships is No. 1 Priority for New Role

As a longtime fundraiser in the nonprofit sector, Megan Hanna discovered she had a knack for building solid relationships and creating inclusive communities for donor and volunteer networks. This ability to connect with others and build a culture made her an ideal fit for the recruitment team at Trility Consulting®.

“Non-profit work is very relationship-focused and rewarding, but I realized I desired a position where I could work in a team environment – not just build one,” shared Megan, whose role as a Talent Sourcer is to identify individuals who are interested in delivering solutions with a team instead of being viewed as an “outsourced contractor” to clients.

Megan’s style of learning about others aligns with the values we seek in candidates. For us, it’s more than keyboard skills and expertise. We seek specific attributes that help ensure Trility builds solutions that consistently deliver value and achieve the priorities our clients expect in a predictable, repeatable, and auditable manner.

Kori Danner / Talent Delivery Manager

When it comes to success, Megan knows it’s “95 percent about building that relationship.” Her recipe for doing this is simple: Be transparent by being honest and forthcoming. “I want to know where I stand with others because this helps me grow and learn,” she shared. “So I look to provide the same experience for those who are interested in career opportunities at Trility.”

Along with the team environment Trility offers, Megan was excited about how the culture translates to team members who are geographically distributed around the United States. “I love the opportunity to work remotely but still feel part of a team on a daily basis with the communication tools and resources available.”

Megan’s two dogs, Mason and Kaylee, are also excited to have her working from home. 

Connect with Megan Hanna

Interested in joining the Trility team? Email or connect with Megan Hanna on LinkedIn.  

We’re Growing

Trility Consulting made the Inc. 5000 list of fastest-growing companies after achieving 131% financial growth over three years and continues to have career opportunities for people interested in becoming more today than yesterday.


New Position to Bolster Successful Delivery Method

If investing time in others brings joy, then Cora Pruitt lives with a constant smile. “I love being of service to others and, to me, Agile coaching is about teaching people how to create solutions and perform in ways that work for them and make them successful,” Pruitt shared. Earlier this year, she chose to move from a contracted Senior Delivery Manager to a full-time team member with Trility Consulting®.

When the new role of Delivery Director was created due to an increase in projects and teams, she was the natural choice to lead and evolve this effort. 

A key factor is having people like Cora leading our teams and ensuring we deliver what is promised and that our teams observe more than they are asked to do and offer options and recommendations from start to finish.

Michael Schmidt / Vice President of Delivery

“Part of Trility achieving year-over-year growth is due to consistently delivering value and tangible outcomes to our clients,” shared Michael Schmidt, Vice President of Delivery. “A key factor is having people like Cora leading our teams and ensuring we deliver what is promised and that our teams observe more than they are asked to do and offer options and recommendations from start to finish.”

This new role provides Pruitt the opportunity to be a positive influence for all team members. “Trility leaders don’t hesitate to let us know how we are appreciated and that we are a part of the bigger picture. I’m excited to be a part of where we go next.”

Valuing productive conflict allows Pruitt to work with clients and team members to ensure observations and feedback align and work towards achieving the best, highest-priority outcomes. “I encourage everyone to keep an open mind for feedback. To succeed as a team you need the ability to embrace other opinions and have confidence to provide yours,” she said. “You don’t get anywhere if you try to sugarcoat things – the bigger the decision, the bigger the conflict.”

One of Pruitt’s finest moments serves as a testament to how she works to serve others. She recalled a former colleague shaking her hand and saying, “Thank you. Because of you, I enjoy coming to work.” This same contracting engagement led the client to nominate her for TAI’s Women of Innovation Award in 2012, where she was named a finalist.

Read how Trility Consulting made the Inc. 5000 list of fastest-growing companies after achieving 131% financial growth in three years.

About Trility 

Comprised of technologists and business consultants, Trility helps organizations of all sizes achieve business and technology outcomes. Clients appreciate that our teams solve problems contextually and bring their people along to ensure a reduced cost of ownership long after the engagement is done. Areas of focus include:

  • Cloud and DevOps
  • Product Design and Development
  • Information Security
  • Data Strategy and Management
  • Internet of Things (IoT)
  • Operational Modernization

Trility is the only business and technology firm with a proven history of reliable delivery results for companies that want to defend or extend their market share in an era of rapid disruption. Headquartered in Des Moines, Iowa, with teams in Omaha, Neb., Kansas City, Mo., and Denver, Colo., our people live everywhere, and we serve clients from all corners of the United States and globally.


Podcast, Part III: Bridging the Gap Between the Art and Science of Data Analytics

Show Highlights

Science is iterative testing; results change over time as variables change. In data science, what’s true today could change, dramatically or incrementally, tomorrow based on one variable. The art is accepting that there will be exponential opportunities to discover more, learn more, and communicate more to find value and purpose in data.

This final episode with Jacey Heuer provides insights into how individuals can seek opportunities in this field and how organizations can purposefully mature data science and advanced analytics.

Missed the first two episodes? Listen to them both: Part I and Part II.

Read the Transcript

0:00:58.1 ME: In our third session with Jacey Heuer, he helps us bridge the gap between the art and science of data analytics. We discuss what is required of people and organizations to explore, adopt, implement, and evolve today’s data science practices for themselves and their organizations.

0:01:18.2 Jacey Heuer: And so I really look at this as, again, bringing it back to science and art. Science gets you to the insight, the art then is how you tell that story and paint that picture to create comfort with some of that uncertainty that you’re now revealing in your data.

0:01:35.2 ME: So as it relates to individuals and organizations and the adoption of a more formal data behavior, through your experience and your perspective, the study, the work that you’ve done, how do we make this a normal, common daily conversation for people and companies instead of this emerging knowledge area that some people are studying?

0:02:05.9 JH: You’re right, the passion is a key component of this, right? I think passion across anything you’re engaged in is important, and it’s a true driver and motivator, finding your passion. Mine is learning, happens to be with data science, and those kind of come together well for me. Just going a bit deeper into my personality with this too: data science, as much as there’s science involved in it, there’s a lot of art involved in it. Personally for me, my background, I have an art background as well, in my past. When you think about left brain, right brain, creativity, logical, all that kind of stuff, it’s usually more binary, more definitive, and for whatever reason, I have some bit of a crossover in that. I can find enjoyment in both sides of that, and it works well for me with data science. But what I think about from the standpoint of trying to wrap your brain around it: what does this mean, how do I gain comfort in the mindset that it takes to deal with and feel okay with ambiguity, uncertainty, right?

0:03:11.9 JH: I think so much and so often in business, which rightly so, it’s, I want to know definitively, 100% accuracy, what’s gonna happen in the future and so on. That’s a fair mindset, and I think there’s a lot of good leaders and people realize that’s not possible, and you make your own decision too. Given the information I have at hand, what’s the best decision I can make, and you go with that. Data science is really taking that human decision process, which you’re already dealing with, uncertainty, whether you’re aware of it or not, and just putting more support to quantify some of that unknown through data. And in that does require a new mindset of, the information I’m taking in may become more broad because I’m getting more data supporting the breadth of my decision-making, but then that also then becomes the realization and vulnerability of really seeing the uncertainty that the decisions I’m making, distilling those in my mind, that uncertainty in a way that I may not be aware of, but now because that data’s present, I’m aware of that uncertainty and becoming more potentially concerned with that uncertainty.

0:04:23.9 JH: And that’s where the side of the data scientist becomes vital and important, it’s a storytelling. And so how do you tell that story and manage the uncertainty that you’re now highlighting to a leadership or an individual that they might not have been aware of before? At least consciously aware of that is maybe the better way to state that. And so I really look at this as, again, bringing it back to science and arts. Science gets you to the insight, the art then is how you tell that story and paint that picture to create comfort with some of that uncertainty that you’re now revealing in your data.

0:04:56.1 ME: That’s similar to just about any career, I imagine, but I know explicitly in the technology side of things where there can be absolutely fabulous software developers who have not yet discovered that they have to also be able to communicate the goal and the journey and the value and manage that message and I wonder if that’s not a learned behavior for any human, but the fact that you’ve articulated the relationship between art and science all as the same collective responsibility, that’s really powerful.

0:05:37.1 JH: Science inherently is journeying into the unknown. Science is meant to constantly test and retest and so on. That’s what good science is, but there’s rigidity, there’s a tool belt that can be applied to that testing. It’s a known set of tools, generally. The art side, the learning there comes through experience, comes through vulnerability, comes through the willingness to test out, does this… From a data science perspective, does this plot with the dots on it mean more than the plot with the lines on it, does the bar chart mean more than the pie chart and so on and so forth, and how do I combine those together to get that message across, and at the same time, beyond the visual, it’s… Your written and verbal communication as well becomes essential ’cause you’re the one creating the confidence in this new idea that you’re bringing to the business.

0:06:37.7 JH: You’re bringing across… A good example I have would be the concept of distribution density plots, so it’s a very statistical term, basically all it is, you think about a normal distribution bell curve, it’s putting some statistics to that bell curve, just for example. How do you convey what that means to someone that has no statistics background? When you say the word density plot, their eyes glaze over. Being able to distill that down to elementary terms, do it in a way that gets your point across and drives the decision that, I think requires just stepping into the arena, finding and seeking out bits of that opportunity to challenge an idea, challenge a mindset with some data-driven visual, some data-driven insight and put it out there and see what happens. Again, science versus art, science, I think you can practice, you can get through history of defined techniques. Art is more, what works, I just have to try it.

0:07:47.4 ME: So I will amplify that to walk into my next question. Your statement was just, “I have to try it.” And part of my curiosity from your perspective is, let’s talk about someone in an organization who’s just now discovering the whole field of data on purpose. Doing data on purpose. So we’re not talking about just your historical typical, “Let’s create a 2D plot in Excel and call it a day.” We’re talking about trying to understand multiple dimensions of many seemingly unrelated things that when put together may actually reveal something that would never have occurred to our minds, we wouldn’t have seen because we weren’t looking for those types of things. For someone that’s just now figuring things out saying, “Hey, I really think that this might be a thing, I want to look into this.” We’re assuming that they’re starting in kindergarten, they’re starting with near zero. Where would they go? How should people get involved, get their feet wet, jump in? What do you see? What do you know? What would you recommend?

0:08:57.0 JH: Luckily, especially within the last decade or so, the learning options online, the open free learning options online have accelerated vastly. Like with a lot of things, a Google search for data science is a good starting point. There’s a number of open free coding academies. Coursera’s a great one, Udacity, things like that, not to market for anything individual, but it’s starting there as just this data science road map. What do I need to learn? What are the foundation skills to kind of build on? And getting a sense of what the scope looks like, I think starting with just that Google search can help define what are some of these terms and areas of this space that pop up and begin to emerge, things like statistics and programming, R and Python and SQL and kind of this whole space, just starting there with that cloud of what’s out there, to me, is always a good way to begin any project. What is my space that I’m living in? Really then what’s probably been most useful to me, it comes down to learning some of the core concepts and technologies, and then seeking out opportunities to practice and apply those, even if you’re stumbling your way through practicing, applying those, start trying to force those into whatever you’re working on right now, and it may not be the solution for your project at hand, but can I take a sliver of it and make it work from a data science lens to build up my skill set?

0:10:34.7 JH: To give a maybe more concrete answer to things to focus on, I think traditional statistics is a great place to start, and again, there’s a number of resources that are great for that, just through a Google search. Statistics being: what is the difference between mean and mode, what’s your range, min and max, how do I define a distribution? Things like that. Starting there, then moving into probability. Probability is a big concept in data science and machine learning, so getting your mind around that space. You don’t have to be an expert in it, but at least become familiar with the terms of probability. Bayesian inference is another area that goes hand-in-hand with probability as well. Those three areas, traditional statistics, probability, and Bayesian inference, which has a lot of probability in it, are the core foundational areas of this space to be involved in. And then it’s moving into the technology side. So now that you’ve learned and got a grasp on some of these statistical ideas, pick up R, Python. I’m an R guy.
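The foundational statistics Heuer names first (mean, mode, range, min and max, the shape of a distribution) can all be tried with nothing more than Python’s standard library before picking up R or specialized packages. A quick sketch with invented data:

```python
import statistics

# A small invented sample, e.g. daily sign-ups over two weeks.
data = [12, 15, 15, 18, 20, 22, 22, 22, 25, 30]

print("mean:  ", statistics.mean(data))             # average value: 20.1
print("median:", statistics.median(data))           # middle value: 21.0
print("mode:  ", statistics.mode(data))             # most common value: 22
print("range: ", max(data) - min(data))             # min-to-max spread: 18
print("stdev: ", round(statistics.stdev(data), 2))  # how widely values vary
```

Playing with numbers like these on real data from your own work is the low-cost way to start the "seek out opportunities to practice" step Heuer describes.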

0:11:46.3 JH: Python tends to dominate. Depending on your source, Python might be a little bit in front of R; it could go back and forth. Either one, the mindset I have is become an expert in one, but be familiar across both of them, ’cause you need to be able to operate on both sides. And in either one of them, you can be working in R and leverage Python, or you can be in Python and leverage R and go back and forth. There’s a lot of capability in the libraries and packages that are out there. And then as you develop the skill set of your technology and some of the base statistics, now start venturing into your machine learning, your AI. And depending on your source and your mindset, all of this really comes back around to developing the skill set to be an expert line fitter, is what it comes down to. I say that kind of tongue-in-cheek, but really, anything you’re doing from a modeling perspective, you’re taking your data set, which may be X number of columns wide, and you can re-imagine that as being X dimensions in space. You have one-dimensional, two-dimensional, three-dimensional space, which is what we all live in. You can plot three dimensions on a plot relatively easily, but as you go up into higher dimensions, you can’t really plot that.

0:13:06.4 JH: That’s where a lot of the mathematics come into play: how do you navigate a multi-dimensional space of data and, out of that, to your thoughts earlier, distill meaning from something that, in this multi-dimensional space, you can’t visualize and there’s no simple way to get your mind around? That’s where machine learning and AI come into play. Those tools are effectively finding the pattern in that multi-dimensional space that lets you either split it up or pinpoint a data point and so on. So that’s the foundational skill set I think I would focus on. And then from that, there’s subsets and offshoots; you get into TensorFlow and PyTorch and all these other things, into the cloud, all that, but that’s the core of where you really start when you’re talking about “What do I need to get into and start learning to go down this path?”
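Heuer’s “expert line fitter” quip can be made concrete with ordinary least squares, the simplest case of fitting a pattern to data. A plain-Python sketch (the points are invented and lie exactly on a known line, so the fit is easy to check):

```python
def fit_line(xs, ys):
    """Ordinary least-squares fit of y = slope*x + intercept to paired data."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    # Slope is the covariance of x and y divided by the variance of x.
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    var = sum((x - mean_x) ** 2 for x in xs)
    slope = cov / var
    intercept = mean_y - slope * mean_x
    return slope, intercept

# Points lying exactly on y = 2x + 1; the fit should recover those numbers.
xs = [1, 2, 3, 4, 5]
ys = [3, 5, 7, 9, 11]
slope, intercept = fit_line(xs, ys)
print(slope, intercept)  # 2.0 1.0
```

With more columns, the same least-squares idea yields one coefficient per dimension, which is where libraries like scikit-learn in Python or `lm` in R take over.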

0:14:01.9 ME: So you led with, “Look for opportunities,” and then after that, I believe you said, “You need to go learn some fundamental elements of statistics,” and there were three different areas you were focusing upon. Then, “Go learn about some of the technology.” Then after that, you were talking about how you can start to take the statistics plus the technology and start discovering, seeking, or otherwise applying that. So you’re starting to become operational at that point. The first two steps are really classes of preparation, if you will, and you start to become operational after you have those two classes of things under your belt in terms of familiarity, experiential pursuit, that type of thing. So really three big steps. What you just communicated is a time-based journey, of course, but I think one of the most valuable things you may have said there is: ultimately you have to seek the opportunities, or this was just an academic exercise of reading about this, then reading about that, and then tomorrow there are new subjects.

0:15:11.2 JH: Very true, and really, the reason for that is the space is so broad. I don’t think it’s unique to data science and this discipline, but there’s so many methods, so much research out there, and there’s typically no standard problem. And so it’s really that process of, “I have a problem, now what are some methods that I can maybe force on that problem?” And again, I think this is common across many skills and disciplines, but as you add breadth to your knowledge base, a lot of the power you bring to your emerging role as a data scientist is not necessarily the expertise you have in a particular method or approach, but the knowledge base you contain of what the alternatives are for solving this problem. So now, instead of one tool that I try to force onto this problem, I’ve got a selection of 10 tools with which I can explore that space. I may not be an expert in all 10, but at least I know I can try 10 of those, find the one that seems promising, and then really dig into that and become a deeper expert to solve that particular problem. That’s where, again, as you step further into this career, your breadth of knowledge becomes greater, and a lot of that skill set and value comes from, “I’m not a one-trick pony.” For lack of a better term, “I can pull from this tool set and find a better answer, the best answer.”

0:16:47.6 ME: Well, that is consistent with what you said earlier, which is, you’d like to be an expert in at least one, but functional and useful in both or all. To some extent, I can be an expert and a generalist, and that will take me further down the road than, “I have a hammer.”

0:17:05.0 JH: A lot of that, I think is just tied to the availability of information in this space. So I have the tools at my disposal to go and learn, and again, going back to some of the prior comments, having the passion to learn, being driven by some learning, identifying when you have that knowledge gap and then going, seeking out and learning that new tool set that previously you may have just been, kind of aware of, but now I know I might need it to answer the questions, so let’s go dig into that. Capitalizing on that motivation and building that knowledge from there, I think is essential as well.

0:17:44.7 ME: If I’m an individual, regardless of where I am on my career path, I’m new in my career, or I’ve been around for a while, or I’m in the later third of my journey, whatever it is, is really irrelevant. And if I’m an individual and I’m in a company and they’re not asking me, they’re not talking about any type of analytics, they’re not talking about BI, they’re not talking about any of this stuff. And I’m interested in doing this stuff, it’s probably on me to figure out, “Okay, where is my company? Where are they wanting to go? What problems do they want to solve? And how can I apply these things I’m exploring to proactively propose and find and encourage opportunities?” And that might actually be a wonderful journey, it could be a wonderfully educational journey, or it could be a tough journey in the event that you stand alone with that appetite to learn like that.

0:18:37.0 JH: That’s the reality. Whether you’re in a role that isn’t defined traditionally as a data scientist or data analyst, and you’re trying to spark your journey into that, and the organization hasn’t adopted yet, or you’re in a role that, you’re a data scientist in a larger data science team and the organization is fully invested in it. I think for many organizations, there’s still an education gap of what really is advanced analytics and data science and what are the questions that we need to leverage them to solve for us? How do we ask that question? When do we bring them in?

0:19:15.5 JH: I think that's a universal, continuous thing, and solving it requires, again, that term vulnerability: the vulnerability and the willingness to push the idea forward as you continue to gain knowledge, insight, and learnings, and to bring those up to the decision-makers, the project owners, whoever it might be in the organization: "Here's a new way of thinking about this." They may have heard of it, but they probably haven't heard what ML or AI actually means. I wouldn't say imposing, but putting that perspective out there and making them aware of it becomes as much of your role as anything. If you want to develop that skill set and bring that impact to your organization, you really need to drive that thinking and drive the mindset shift it requires to incorporate advanced analytics and data science into an organization.

0:20:11.3 ME: So if I’m a C-Suite leader, and I have all kinds of amazing responsibilities that go with my role in the organization, just like your role in the organization, and I’m feeling the pressure to make my numbers, and manage my market, and address the current economic situation, all of the things. And you’re the aspiring data person, and you come to me and say, “Hey, Matthew, I’ve been looking at this stuff. I’ve been studying some things. I have a couple of thoughts.” How would you approach me? What would you say to me? Not that I’m belligerent and stubborn and cranky, but rather I’m just on the move, and I’m looking for concrete chunks, if you will.

0:20:48.2 JH: It's a great thought exercise and an important one. What's been powerful for me is showcasing the art of the possible, but doing it in comparison to current state. Whatever your question is, just for the example here: here's the report, the current process, the current output, what it looks like now, and I'm delivering that to you, so I'm maintaining my relationship with you. I'm not falling short or anything like that. But I'm taking some of these new learnings, and it takes a time commitment, but passion should drive that, to layer in a slice or two of something new on the side of that. Maybe I'm forecasting next quarter for you. And traditionally, it's just been: here's what happened last year, we're gonna add some percentage to that year over year, something very simple.

0:21:43.8 JH: And now I'm gonna go in and enhance its current state by putting some confidence intervals on it, and giving better scenario analysis around, if you do X, we see Y. And I start to tell that story of what's the next level. It may not be perfect, but you're at least creating awareness of the capability that you're developing and bringing to the organization. And hopefully, through that, you begin to create excitement: "Hey, I'm the leader, the executive. I can see the improvement here, let's dig into that further." And you start to get the wheel spinning and that progress rolling from that.
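A minimal Python sketch of that "next level" step: the naive year-over-year percentage forecast alongside an enhanced version that estimates growth from history and wraps a normal-approximation interval around it. The sales figures, the fixed 5% growth, and the 95% interval are illustrative assumptions, not anyone's production model.

```python
import statistics

# Eight quarters of sales for one hypothetical product line.
sales = [100, 104, 110, 118, 123, 131, 140, 151]

def naive_forecast(history, growth=0.05):
    """The 'traditional' approach: last period plus a fixed percentage."""
    return history[-1] * (1 + growth)

def forecast_with_interval(history, z=1.96):
    """Mean quarter-over-quarter growth with a ~95% normal-approximation interval."""
    growth_rates = [b / a - 1 for a, b in zip(history, history[1:])]
    mu = statistics.mean(growth_rates)
    sigma = statistics.stdev(growth_rates)
    point = history[-1] * (1 + mu)
    low = history[-1] * (1 + mu - z * sigma)
    high = history[-1] * (1 + mu + z * sigma)
    return point, (low, high)

point, (low, high) = forecast_with_interval(sales)
```

The point is the delivery, not the math: the executive sees the same report they always got, plus a range that makes the uncertainty, and the new capability, visible.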

0:22:20.6 ME: That’s very tangible. Here’s what we’re currently doing, here’s what we’re using it for, and what it seems to mean to us. Here’s what we could be doing, and here’s how it may actually add additional dimension or insight or view or value. That’s really good, that’s very concrete.

0:22:37.3 JH: It's powerful, and I'll say, what can be scary in that, fearful in that, is you have to put yourself out there again. I go back to this just because I don't fit the stereotypical IT mindset or data science personality, and things like that. But again, it's not waiting for the business direction sometimes, but just taking a chance and stating, "I think if we did this, this could be the improvement," and at least starting that conversation. It's that awareness, that seed of awareness, that becomes powerful, and it might not be right, but at least you're creating visibility to a capability that either exists in your skill set or can exist, and now starting that conversation.

0:23:24.6 ME: Well, let’s shift it a little bit then. So these companies that are starting to realize, “Hey, we need to be a little more aggressive, a little more assertive about what data, how data, when data. How can we get to where we really want to go, and how do we make this data thing work for us?” But if I’m a company, and I’m looking for people, where am I going to find people? If I don’t have people saying, “I’ve been thinking about this, I want to do this, and I’m starting brand new.” Where am I going to find these folks? Are there data conventions? And you guys are all hanging out like, “Pass the tea. Let’s talk about this.”


0:24:03.5 JH: Candidly, I don't know if I have a proper or great answer for that, other than, I think, in this space… As much as we've talked about the hard skills of data science and the art of data science, I think the other piece to be aware of is the subject matter expertise for that organization, which becomes essential. You could think of a diagram of this with those three elements in it. That subject matter knowledge becomes essential to really developing impact out of advanced analytics and data science for the organization. I think often, for an organization to define success in this, it's finding individuals who are, again, driven by learning, have curiosity, and are motivated to learn, preferably in this space, but also having in place mechanisms that allow them to ramp up the business knowledge, that organizational knowledge, they bring. What product are you manufacturing, and what are the nuances of manufacturing that product? How does the sales team sell that product? That business knowledge and its nuances are key to success in data science.

0:25:24.0 JH: Using myself as an example, when I've come into an organization, I've tried to focus the first few months strictly on relationship building. Finding that conduit into who the people are that represent the space in the organization, who can become my source of… my vessel of knowledge that I can tap into. Because when I'm working with data and trying to build a model, there are endless questions around, "Do I pull in column A or column B? Do I combine them? Do I create something entirely new? Does this mean anything?" Because what I think is meaningful in the data may be statistically significant, all that kind of stuff, but when it actually goes out to the field, you get feedback and that expert knowledge of, "Well, we actually don't operate like that, so your insight is meaningless." If I can get that knowledge, or at least a representation of it, that's where a lot of power exists: my underlying skill sets, technical knowledge, storytelling abilities, all that stuff can come together and leverage that subject matter knowledge. So I don't know if I answered your question well, Matthew, or not, but I think it's organizations developing pipelines or… pipelines isn't the best word… environments that are conducive to that transfer of knowledge between the subject matter expert, and the…

0:26:38.3 JH: Data scientist, the advanced analytics, and those using the data. That knowledge sharing, I think is where a lot of that power resides.

0:26:47.2 ME: So that's a way they can discover the value and use, and help grow and foster a culture that grows people. But you didn't yet tell me if there are conventions where data scientists like you all sit in smoking jackets, having tea, discussing the latest algorithms over breakfast.

0:27:07.4 JH: So those do exist depending on your space and need and so on, right?

0:27:13.6 ME: Right.

0:27:14.9 JH: The term data science is just over a decade old in formality. If I'm remembering correctly, I think it's credited with originating at LinkedIn, kind of where it formally started, and don't quote me on that. A lot of the build-up and hype to where we are now with data science… Let me rephrase: not build-up and hype, but the growth in this discipline, and the rate of growth over time, started with technology companies latching onto researchers who were presenting on neural nets, artificial intelligence, and machine learning at their dedicated conferences. One of the conferences that has been around for decades is called NIPS, N-I-P-S; NeurIPS is the new term given to it. Up until a decade ago, it was a conference attended by maybe a couple hundred researchers off in a corner; now it's annually attended by thousands of people. That's where a lot of the original poaching occurred: these researchers were brought from academia into practical application data science. That's an extreme example.

0:28:37.8 JH: I think there are many different organizations out there. I think of TWI as one, and IIA, the International Institute for Analytics, and so on. There are all these different organizations that, to your point, Matthew, maybe aren't sitting around in smoking jackets and so on, but they're gatherings of analytics and analytic mindsets that bring a lot of talent and skill sets together, and they can be sources of experienced skill sets and experienced individuals. And then, to give credit to the universities: over the last decade or so, more universities are offering more programs related to business analytics, data analytics, and so on. That pipeline is filling up, becoming more robust and more refined as well, and there are quality new grads beginning to come out of universities as more learnings are applied there.

0:29:35.8 ME: It's a normal, normal problem. Educational institutions are themselves businesses, or else they cease to exist. It's not a free world here, so these folks have the responsibility and the desire and the goal to enable and equip and educate and all of those things. A reality, though, is the gap between learning these concepts and living them, even illustrated by your earlier point: go learn about statistical things, whether it's statistics in and of themselves, probability, and all of those things; then learn the tools, the Python and anything else that makes sense; then figure out how to operationalize that; and then start getting into splinters. That's a journey that has to be lived. Journeys aren't ordinarily lived in college or university. Journeys can be enabled. The fact that universities are offering more and more data education is outstanding.

0:30:28.9 ME: But it's fun to see how this is evolving. It's fun to see where it's going. To your point, 10 years, thereabouts, plus or minus: plenty of places to go on the web, many conventions to go to, seeing how it's evolved from a small subset of researchers to thousands and thousands of people who are interested now. What a wonderful evolution of an idea that we're getting to watch unfold right now. And as far as what it means? Heck, that's part of the whole challenge. What is it? When is it? What does it mean? How do we make use of it? This has been a phenomenal conversation with you, good sir. Thank you very much for taking the time to teach us about so very many aspects of the journey of data and your journey with data, and thank you very much for taking the moment to give some pointers to people who want to learn how to have a journey like the one you're having. Thank you.

0:31:28.4 JH: Thank you, Matthew, and I couldn't agree more with those thoughts. It's a great journey that this whole space and discipline is on, and there's a lot of runway left in it. And because of the uncertainty, there's a lot of room for creativity and impact to be had as more people venture out and become skilled in this space as well. I've enjoyed the conversation, learned more about myself, and hopefully was able to share some good thoughts along the way, so thank you.


Trility Consulting Joins Inc. 5000 Fastest-Growing Companies List

When the founders of Trility realized they needed to form a company instead of individually contracting on extremely tough projects, they knew they’d bring value to clients and provide challenging work to those who joined the team. Little did they realize how quickly the teams and expertise they pulled together would make the Inc. 5000 list in their first year of eligibility.  

“Trility is a team of people always and only working towards one goal: Add the most value to our client experiences possible, moment by moment, each and every engagement. The by-products of that singular goal are revealed as satisfied clients, happy, healthy teams, and cultural, company, and financial health. It is because of our teammates that we collectively, clients and Trility alike, experience value-based success.”

Matthew D Edwards / CEO

The Inc. 5000 is an annual list that ranks the fastest-growing private companies in America. Trility is eligible as a privately-held U.S.-based company with four years of sales with a minimum of $100,000 revenue in the first year and a minimum of at least $2 million revenue in the most recent year.

Trility opened its doors in 2017 with two Fortune 500 clients and has achieved consistent revenue growth by working with companies of all sizes across 19 industries. Each seeks different solutions but shares one trait: they view technology as the way to thrive amid market, economic, regulatory, or competitive headwinds. Despite the challenges of a pandemic, Trility and its clients remained resilient.

Edwards shared, “We are blessed to have great clients. We are even more blessed to have great people in our company who choose to become more each day than they were the day before. And they repeat this every single day of their journey with our clients and our company. I am proud to stand beside the people at Trility.”

Iowa-Based Companies

Trility joins 31 other Iowa-based companies who also made the Inc. 5000 list: VizyPay, MCI, Higley Industries, The Art of Education University, Pet Parents, Moxie Solar, Trader PhD, English Estates, Eagle Point Solar, Eco Lips, PowerTech, Heritage Group, Itasca Retail Information Systems, Heartland Roofing, Siding and Windows, Spinutech, MedOne, MediRevv, Trility Consulting, Dwolla, Highway Signing, Express Logistics, Schaal Heating and Cooling, Kingland Systems Corporation, JT Logistics, Clickstop, McClure, Peoples Company, Aterra Real Estate, GrapeTree Medical Staffing, Involta, and Ivy Lane Corporation.

Among the 31 Iowa-based companies, the average median three-year growth rate was 140 percent and total revenue reached $823.9 million. Together, this list of companies added more than 7,338 jobs over the past three years and remained competitive within their markets given 2020’s unprecedented challenges. 

About Inc. Media

The world’s most trusted business-media brand, Inc. offers entrepreneurs the knowledge, tools, connections, and community to build great companies. Its Inc. 5000 list, produced every year since 1982, analyzes company data to recognize the fastest-growing privately held businesses in the United States. Complete results of the Inc. 5000, including company profiles and an interactive database that can be sorted by industry, region, and other criteria, can be found at


Podcast, Part II: The Artistry Required for Data Science Wins

Show Highlights

In the second episode of this three-part series, Jacey Heuer helps us dive into the evolving roles and responsibilities of data science. We explore how individuals and organizations can nurture how data is purposefully used and valued within the company.

Missed the first part? Listen to Part I.

Individual Takeaways

  • Adopt a scientific mindset: The more you learn, the more you learn how much more there is to know.
  • Hone storytelling capabilities to engage and build relationships that ensure the lifespan and value of data is woven into the culture.
  • Set one-, five-, and 10-year goals and aim to achieve them in six months to fail fast and advance the work faster than expected.
  • Create buy-in using the minimum viable product (MVP) or proof of concept approaches.
  • Prepare to expand your capabilities based on the maturity and size of the team focused on data science work. As projects develop, you’ll move from experimenting and developing prototypes to developing refined production code.

Organizational Takeaways

  • When your company begins to use data analytics, roles and responsibilities must expand and evolve. Ensure your people have opportunities to grow their capabilities.
  • Data must be treated as an “asset” and viewed as a tool for innovation. It can’t be tacked on at the end. Ideally, it plays a role in both new and legacy systems when aggregating data and capturing digital exhaust.
  • Engage and find common ground with all areas of the business by helping them comprehend how data science “expands the size of the pie” rather than taking a bigger slice.

Read the Transcript

0:00:05.5 Matthew Edwards: Welcome to the Long Way Around the Barn, where we discuss many of today’s technology adoption and transformation challenges, and explore varied ways to get to your desired outcomes. There’s usually more than one way to achieve your goals. Sometimes the path is simple, sometimes the path is long, expensive, complicated, and/or painful. In this podcast, we explore options and recommended courses of action to get you to where you’re going, now.

0:00:37.3 The Long Way Around the Barn is brought to you by Trility Consulting. For those wanting to defend or extend their market share, Trility simplifies, automates, and secures your world, your way. Learn how you can experience reliable delivery results at

0:00:57.0 ME: In this episode of the Long Way Around the Barn, I picked up where Jacey Heuer and I left off in our first conversation on data science, which has now become a three-part series. Today’s conversation focuses on how both individuals and organizations can leverage data analytics and machine learning, to evolve and mature in their purposeful use of data science.

0:01:22.0 Jacey Heuer: It takes a diligent effort from the data team, the advanced analytics team, to engage with the architects, the developers, those groups, to get your foot in the door, your seat at the table. I think getting to that state means that data is seen as a valuable asset to the organization, and is understood as a tool to drive this evolution into a next stage of growth for many organizations, to achieve those dreams of AI, machine learning and so on, that lie out there.

0:01:57.7 ME: We start by diving into how the various roles fit into today’s data science ecosystem.

0:02:04.8 JH: The primary roles that I define in a mature team, as it relates to the actual analytics: the data analyst, the data scientist, the machine learning engineer, and then MLOps. And what's becoming a newer term, taking this further, is the notion of a decision scientist.

0:02:24.1 JH: There's a lot of roots in, you could say, traditional software development in terms of what is becoming defined for data science and, I'll say, the space of advanced analytics. Generally speaking, not every organization or team will be structured this way, but I think it's a good aspirational structure to build into. The idea is that you have your data scientists, and their real focus is on prototyping, developing the predictive or prescriptive algorithm, and taking that first shot at it. Then you have this data analyst role, which is really more of the traditional analytics role, where it's closely tied into the organization and they're doing a lot of the ad hoc work: "I want to know why so-and-so happened. What's the driver of X?" Things like that.

0:03:20.2 JH: So there's a little bit of predictiveness to it, but a lot of that role is, "Tell me what happened and help me understand what happened." And then you can start extending this out, and you start thinking about the machine learning engineer. That's really taking the step now from the data scientist, who's made that prototype, to handing it off to the machine learning engineer, whose role is to bring that to production, put it into the pipeline. Oftentimes, that may also mean handling the productionalizing of the data engineering pipeline, or the data pipeline, as well.

0:03:52.7 JH: So being able to go, in a production sense, from the data source, maybe through your data lake, through transformations, and into this model that is often written in Python. R and Python are the two languages that dominate the space. Python is often the better language because it's a general programming language and it integrates well with more applications, things like that, but R still has its place. I'm partial to R. Nothing wrong with either one.

0:04:23.1 JH: But that machine learning engineer is really tasked with bringing this into production. And then the next step in this is MLOps. The machine learning engineer falls into that bigger MLOps category, but it's that role of: once that algorithm is in production, it's up on the mobile phone, it's up on the progressive web app, it's being used, now there's an ongoing process of monitoring it and being able to understand, "Is there drift occurring? Is your accuracy changing? Is performance in that model changing?" This gets into, if you've heard of them, the ROC curve, AUC, and things like that, which monitor performance of that model. And that in itself, depending on the number of models that have been deployed, can be a task. If you have a few hundred models out there and a changing data environment, there will be a need to update and change; it may be that individual's task to go in and retrain the model, or work with the data scientist again to prototype a new model.
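To make that monitoring loop concrete, here's a small Python sketch: AUC computed via the rank-sum (Mann-Whitney) formulation, plus a naive drift alert that flags a model for retraining when AUC degrades past a threshold. The 0.05 tolerance is an illustrative assumption, and ties in scores are ignored for simplicity.

```python
def auc(labels, scores):
    """Area under the ROC curve via the rank-sum formulation (ties ignored)."""
    order = sorted(range(len(scores)), key=lambda i: scores[i])
    # Sum the 1-based ranks of the positive examples.
    rank_sum = sum(r + 1 for r, i in enumerate(order) if labels[i] == 1)
    n_pos = sum(labels)
    n_neg = len(labels) - n_pos
    return (rank_sum - n_pos * (n_pos + 1) / 2) / (n_pos * n_neg)

def drift_alert(baseline_auc, current_auc, tolerance=0.05):
    """Flag a model for retraining when AUC drops more than `tolerance`."""
    return (baseline_auc - current_auc) > tolerance
```

In practice, an MLOps pipeline would run a check like `drift_alert` on a schedule for each deployed model, against fresh labeled data, which is exactly the "few hundred models" workload described above.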

0:05:31.4 JH: So those are the primary roles that I define in a mature team, as it relates to the actual analytics: the data analyst, the data scientist, the machine learning engineer, and then MLOps. And what's becoming a newer term, taking this further, is the notion of a decision scientist. This is really the person who is crossing the gap, or bridging the gap, from, "We've implemented or discovered an algorithm, discovered a model that can predict so-and-so with high accuracy," whatever it is. Their role is to take that and drive the implementation, the buy-in from the business partners, to help them make better decisions. So they have a foot in both camps: "I understand the models, I understand the technical side, but I can sell the impact of this and influence the decision that the business partner is making."

0:06:31.2 ME: What is the name of this role, again?

0:06:34.5 JH: The term that I see for this, and like to give it, is decision scientist. It's much more focused on changing and improving the decision, and having a tighter role on that side of it, as opposed to what can be more technical, the data scientist or machine learning engineer, who are much more focused on the data, the programming, and so on.

0:07:00.1 JH: And the reality of this is, many organizations won't be at a maturity level to have those distinct functions and roles. There's going to be a blend, and it'll be maybe one or two people who have to span the breadth of that: balancing traditional analytics with discovering new algorithms, with productionalizing them, with doing some data engineering, with MLOps, with speaking to the business partners and selling the decision, the new decision process, to them, and so on. And that's good and bad, obviously. You can overwhelm a small team with that, but you can also find great success in it. There's a mindset involved in this. I don't know who to credit it to, but it's a good mindset that I like. It's essentially: establish what your one-, five-, and 10-year goals are and try to do it in six months. You're probably going to fail, but you're going to be a lot further along than the person who is trying to walk to those longer-term goals.

0:08:02.0 ME: You're saying that the larger the organization, the more likely these ideas or behavior classes will be spread across different roles. But that then suggests that in smaller organizations, one or more people may be wearing more than one hat.

0:08:19.6 JH: I think the better term is more mature data organizations. You could be a small or large organization, but what's the maturity level of your usage of data, your support of the data needs, data strategy, data management, things like that? Often it follows a sequence, where it may start with this data analyst role making the initial engagement. A business partner comes to the data team and says, "Hey, we have a desire to understand X better." The data analyst can go and work on that and develop some initial insights. And out of those insights, that's where the data scientist can step in, take those insights, and say, let's build an algorithm for that. We understand that if we reduce price, we drive up quantity, typical price elasticity.

0:09:03.8 JH: We see that in our data; our industry, our market reflects that. Well, let's go and build an algorithm that can optimize pricing across our 80,000 SKUs. So we build this algorithm and we bring in environmental variables, variables for weather, regional variables, all that kind of stuff, and really make it robust. Well, now we need to put it into production. So I hand it off to ML engineering, they go and build this pipeline and write it in Python; maybe the data scientist worked in R, so we do a conversion to Python. They tie it into a mobile application, so sales reps can have pricing information at their fingertips while they're having conversations.
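The core of such a pricing algorithm can be sketched in a few lines: a constant-elasticity demand curve fit from two observations of one SKU, then used to compare revenue at candidate prices. The SKU numbers are hypothetical, and a real model would layer in the weather, regional, and other variables mentioned above.

```python
import math

def estimate_elasticity(p0, q0, p1, q1):
    """Arc (log-log) price elasticity of demand from two price/quantity observations."""
    return (math.log(q1) - math.log(q0)) / (math.log(p1) - math.log(p0))

def predicted_quantity(q0, p0, new_price, elasticity):
    """Constant-elasticity demand curve: q = q0 * (p / p0) ** elasticity."""
    return q0 * (new_price / p0) ** elasticity

def revenue(price, q0, p0, elasticity):
    """Expected revenue at a candidate price under the fitted demand curve."""
    return price * predicted_quantity(q0, p0, price, elasticity)

# Hypothetical SKU: 1,000 units sold at $10; 1,300 units when discounted to $8.
e = estimate_elasticity(10, 1000, 8, 1300)  # negative: demand falls as price rises
```

For this SKU the estimate comes out elastic (|e| > 1), so the model would predict more revenue at $8 than at $10; repeated across 80,000 SKUs, that comparison is what the production pipeline would serve to the sales reps' mobile app.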

0:09:45.6 JH: So now you have the sequence playing out where, again, often in a less mature data group in an organization, that's going to be one or two people wearing those multiple hats. And if that's the state, if you're a less mature organization, I think the best approach, and it kind of follows the notion of Agile methodology and things like that, is really this MVP notion. The best way to eat an elephant is one bite at a time; that's a real concept when you're trying to grow the maturity of your data team. Let them focus on really developing the different pieces of it and getting them in place before expanding them to take on something more. Identify the project that you can get buy-in on, that's expected to have some value for the organization, and go build that out, to really develop that POC and that first win.

0:10:36.6 ME: That's interesting. That's a fun evolution. One of the things we've watched change through the years is the idea of information security and regulatory compliance. In days gone by in the software world, there were requirements, which turned into designs, which turned into software, which turned into testing, which turned into production stuff, and that's largely sequential. The serial dependency is going into production, so waterfall-y. And then as we've evolved and rethought the role of testing as everybody's role, and information security as everybody's role, and all of these things, and we introduced continuous integration and continuous delivery, it's really thrown a lot of things on their head.

0:11:15.9 ME: Nowadays, we're able to actually attach tools, and granted, sometimes they're just literally hanging ornaments off trees, but we're able to attach tools like vulnerability assessment tools. We can write penetration test suites or smoke suites, and we can attach them to a pipeline that says, "For every new payload that comes down the line, apply these attributes, characteristics, and ideas to it, and make sure that it's heading in the direction that we all choose." You can fail the build right there, or you can flag it and send a love note to somebody and then remediate it in a meeting later with coffee.
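That fail-or-flag policy can be sketched as a tiny pipeline gate. The severity names, the shape of a finding, and the gate policy here are assumptions for illustration, not any particular scanner's output; the CVE identifier is made up.

```python
def gate(findings, fail_on=frozenset({"critical", "high"})):
    """Split scanner findings into build-failing and flag-only buckets."""
    failed = [f for f in findings if f["severity"] in fail_on]
    flagged = [f for f in findings if f["severity"] not in fail_on]
    return failed, flagged

# Hypothetical findings from scanning a new payload in the pipeline.
findings = [
    {"id": "CVE-XXXX-0001", "severity": "critical"},  # made-up identifier
    {"id": "lint-042", "severity": "low"},
]
failed, flagged = gate(findings)
build_ok = not failed  # fail the build right there on criticals...
# ...and send `flagged` as the "love note" to remediate later.
```

The design choice is exactly the one described: the pipeline enforces the non-negotiables automatically, while lower-severity items stay visible without blocking delivery.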

0:11:54.1 ME: And now, we're all able to be together in one cross-pollinated team, bring in Infosec on purpose, and design with Infosec in mind, on purpose, from the beginning. And so acceptance criteria and user stories and epics and all of these things have attributes that say, "For this, these things must exist and these other things can't exist." And now information security can be tested during the design, as well as during development, continuously, instead of surprising people later like an afterthought, like salting after you've grilled the meat as opposed to before, that type of thing. And even that's its own religious conversation.

0:12:35.2 ME: With the data stuff, I'm curious. Do you feel like data is being included in… You mentioned Agile, so I'll talk about scrum teams, delivery teams, strike teams, that type of thing. These cross-pollinated teams composed of developers, designers, human factors folks, data folks, all of the different types of folks: one team, one priority, one deliverable, one win, one party, that type of thing. Do you feel like the idea of data is being proactively included in the design and development of ideas, or is it an afterthought, or are you getting Frankenstein on a regular basis and somehow you have to make magic out of a pile of garbage? How are you seeing things evolve, and where do you hope it's going?

0:13:18.1 JH: The Frankenstein is a good illustration of that. I think, often, data as it relates to analytics needs is an afterthought when it comes to application design and development and everything that goes along with that. A lot of that, I think, is primarily due to the relative youth of advanced analytics, data science, machine learning, and so on. In reality, the moniker data scientist is maybe a decade old or so; there were statisticians and so on before that, and data science is really just the next step down that path.

0:14:00.9 JH: So for example, for me, having practiced data science in a number of mature organizations, mature meaning they're 90-plus years old or have been around for a while and have built systems to meet certain requirements, transactional requirements, things like that. Those systems perform their purpose well, but that purpose wasn't necessarily built with a mindset of, "How can we improve this, or leverage the knowledge that can come out of those systems, the data that can come out of them, to be applied elsewhere in the business?"

0:14:33.5 JH: And the term I'd give that is these applications are creating data exhaust, where it's a byproduct. Maybe it's getting stored in a SQL Server someplace, or some database, and maybe there's some loose reporting built on it, but it's probably not easy to go and query. Maybe it's a production database by itself, so if you try to query a lot of it, you run into concerns about impacts on performance for the production database and production systems, and so on. So one of the practices I've been really focused on with this experience is injecting the presence of data science and advanced analytics into that application design process, into the design of those new systems, to give a lens into, "What does the algorithm need to be performant? What kind of data do we need? And let's ensure there's a thought process behind how that data is being generated, the flexibility to test within that system how data is being generated and where it's going, how it's flowing out, how it could be accessed, how it can be queried."

0:15:53.2 JH: There's a good example; this is going to be a bit of a technical example, so forgive me. One of the systems in a prior organization I worked with would move everything in very embedded, complex XMLs; that was how the ETL process happened. And from a data science perspective, that's not an easy thing to shred apart and dig into, to get to all those layers and hierarchies within a super complex XML. But the system performs to its purpose within the organization, and it does what it's supposed to do. So from that side of it, it's a great system that works.
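For illustration, shredding a nested document into flat, queryable rows with Python's standard library looks something like this. The document here is a hypothetical, much simpler stand-in for the deeply layered payloads described above.

```python
import xml.etree.ElementTree as ET

# A hypothetical nested order document, standing in for the
# "embedded, complex XML" ETL payloads described above.
doc = """
<orders>
  <order id="1">
    <lines>
      <line sku="A-100"><qty>2</qty><price>9.99</price></line>
      <line sku="B-200"><qty>1</qty><price>24.50</price></line>
    </lines>
  </order>
</orders>
"""

def flatten(xml_text):
    """Shred the hierarchy into flat rows that are easy to load and analyze."""
    root = ET.fromstring(xml_text)
    rows = []
    for order in root.iter("order"):
        for line in order.iter("line"):
            rows.append({
                "order_id": order.get("id"),
                "sku": line.get("sku"),
                "qty": int(line.findtext("qty")),
                "price": float(line.findtext("price")),
            })
    return rows

rows = flatten(doc)
```

With real payloads, every extra layer of hierarchy multiplies this shredding work, which is why getting analytics a seat at the table during system design matters.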

0:16:36.0 JH: It’s an old system, but it works. But from the data side, it’s a mess. The outcome was that it caused us to have to Frankenstein things together to try to work with it. The idea is evolving, but I think it takes a diligent effort from the data team, the advanced analytics team, to engage with the architects, the developers, those groups, to get your foot in the door, your seat at the table, to ensure that going forward, as new applications are being built and designed, there’s a mindset of, “What does data science need to be able to leverage this and take us from data exhaust to data gold, data as an asset?”

0:17:19.6 ME: This is a wonderful, wonderful, awesome mess that you’re talking about. We’ve watched the same thing through the years with testing, where it was always testing in arrears, but then people wanted to understand, “Why is the cost of acquisition and cost of ownership so darn high? Why does it hurt so badly to debug software when it’s in production?” Well, testing in arrears is the answer, guys. So test-driven, moving testing or quality behaviors as far upstream as possible, means considering quality while building, not later. And we’ve watched the same evolution in security, where we now design with security in mind, as opposed to trying to bolt that stuff on later.
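
Matthew’s “consider quality while building” point is the heart of test-driven development: the expected behavior is written down as a test before the code exists. A minimal sketch, using a hypothetical `apply_discount` function invented for illustration:

```python
# Test first: the expected behavior is written before the implementation.
def test_apply_discount():
    assert apply_discount(100.0, 0.2) == 80.0
    assert apply_discount(100.0, 0.0) == 100.0

# Then the implementation, just enough to make the test pass.
def apply_discount(price, rate):
    """Return price reduced by a fractional discount rate."""
    return round(price * (1 - rate), 2)

test_apply_discount()
```

Debugging happens at build time, against a stated expectation, instead of in production with no expectation written down anywhere.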

0:18:04.6 ME: And that data exhaust conversation that you’re talking about is a standard problem, even for old-school production support people, whereby somebody built some software, dropped a tarball, threw it over the wall, somebody pumped it onto some old rack-and-stack hardware in a brick-and-mortar data center, and now the developers went home and the infrastructure people have to figure out, “How are we going to make this sucker run?” And then after that, “Why is it broken? Oh gosh, we don’t have log files.” So we have all kinds of challenges through history: no logging, some logging, way too much logging, you’re killing me.

0:18:42.0 ME: And the Infosec people in particular have been on the wrong end of the stick for that, and testers were too, where they had to go figure out why, not what: why. Well, hello, logging. And Infosec people, facing inconsistent logging, trap everything, like they’re the Costco of data, just trying to capture any action so that they can then attach tools and do sifting on it. So we’ve watched software in particular change from, “I do my job, now you do your job,” to, “We are doing this job together,” and it sounds like you are smack in the middle of that outstanding, awesome, messy, sometimes painful evolution, which is, “This is a thing, but not enough people understand the value of the thing, so they’ve got us sitting in this room without windows.”
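
The “no logging, some logging, way too much logging” arc usually resolves into structured logging: events emitted as machine-parseable records rather than free text, so analysts and Infosec can filter rather than grep. A minimal sketch using Python’s standard `logging` module; the `JsonFormatter` class and the `fields` convention are illustrative assumptions, not a standard API:

```python
import json
import logging

class JsonFormatter(logging.Formatter):
    """Emit each record as one JSON object -- queryable later,
    instead of a free-text grep swamp."""
    def format(self, record):
        return json.dumps({
            "level": record.levelname,
            "event": record.getMessage(),
            **getattr(record, "fields", {}),  # structured fields, if any
        })

handler = logging.StreamHandler()
handler.setFormatter(JsonFormatter())
log = logging.getLogger("orders")
log.addHandler(handler)
log.setLevel(logging.INFO)

# Structured fields travel with the event, so downstream tools can
# filter on order_id instead of regexing prose.
log.info("order_created", extra={"fields": {"order_id": 1001, "total": 49.90}})
```

The design choice is that every event is a record with named fields, which is exactly the property that turns log exhaust into something a data team can actually query.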

0:19:33.7 JH: Yes, you hit the nail on the head, Matthew. And that ties back into the conversation of roles and so on. If you go back to the development of a software engineering team or an Infosec team, cybersecurity, things like that, those groups had to get established and find how they fit into the organization. There are a lot of opinions on this right now, in terms of where advanced analytics and data should sit within your organization. Do you report up through IT? Do you report up through marketing? Where do you touch? That’s another big question that’s out there.

0:20:12.2 JH: My preference, and what I’m coming to understand really works best, is to establish data as its own pillar in the organization. The same way that you have marketing, IT, finance, you have data: a Chief Data Officer with a C-level seat who reports up to the CEO, and everything underneath that. Getting to that state means that data is seen as a valuable asset to the organization and is understood as a tool to drive this evolution into a next stage of growth for many organizations trying to achieve those dreams of AI, machine learning, and so on, that lie out there.

0:20:53.9 ME: A lot of these paradigms might be continually challenged, if not destroyed and re-factored. The idea of these verticals, how do I separate data from marketing, from IT, from ops? A lot of those things are HR, Human Resources-derived frameworks, but they aren’t delivery frameworks. And so we’ve continued to have this interesting challenge in companies of, “I have all of these vertically organized people, but they have to deliver horizontally.” So how that gets addressed on the CDO side, or embedded, or whatever, companies are going to figure that out on their own, they usually do, although across whole careers, not necessarily on Saturday. An interesting thing you’ve said to me, though, although you didn’t really say it like this, makes me think that the idea of data by design is actually a thing. When we’re building systems, when we’re building out epics and user stories and acceptance criteria, the people that are there, the developers, the designers, the data folks, sometimes that gets messy, where people think of it from an old kind of database perspective as opposed to, “What do I actually want to know? What am I actually going to do?” and letting that influence the design and the implementation thereafter. Without asking those questions, this is a Frankenstein conversation all day, every day. Data by design needs to become a thing, and data needs to be included in strike teams or delivery teams on purpose, on a regular basis.

0:22:30.6 JH: What matters is the presence of that knowledge of what’s needed to bring that data to value, for it to become an asset. You mentioned asking the question of what we need and what we want to know; that really has to come from the data scientist, the advanced analytics team, having a voice in that conversation, to be able to say, “If we’re building an application that is going to provide product recommendations to an end user, then in that application, I need to know what algorithm is potentially going to be applied, how it’s going to be applied, and what that algorithm needs, from a data perspective, to perform. Is it going to be an online versus offline learning environment, which is essentially the difference between streaming and batch in terms of how we model and build predictions? What does that mean? What is that going to take? Do I need certain REST APIs built in to access data in some way, or is it going to be a batch dump overnight into the data lake for us to build something on?”
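
The online-versus-offline distinction Jacey raises can be made concrete with a toy one-parameter model, y = w·x. This pure-Python sketch is illustrative only; a real system would use a library’s batch fit and streaming update equivalents, but the contrast is the same: the offline learner sees the whole dataset at once, the online learner sees one sample at a time as a stream would deliver it.

```python
# Roughly y = 2x, with a little noise.
data = [(1, 2.1), (2, 3.9), (3, 6.2), (4, 7.8)]

def batch_fit(samples):
    """Offline/batch: fit against the entire dataset at once
    (closed-form least squares for the single weight w)."""
    num = sum(x * y for x, y in samples)
    den = sum(x * x for x, _ in samples)
    return num / den

def online_fit(samples, lr=0.02, epochs=50):
    """Online/streaming: update the weight one sample at a time,
    the way a live event stream would feed the model."""
    w = 0.0
    for _ in range(epochs):
        for x, y in samples:
            w -= lr * (w * x - y) * x  # SGD step on squared error
    return w

print(f"batch w = {batch_fit(data):.2f}, online w = {online_fit(data):.2f}")
```

Both land near w ≈ 2, but the data plumbing they demand is different, which is exactly why the application design (REST access versus an overnight batch dump) has to be decided with the algorithm in mind.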

0:23:34.7 JH: All those questions really need to be designed with a perspective from a data scientist, or an engineer who has knowledge of the data science requirements and the process, and preferably it would be the joining of those two together, allowing them to work and bridge that gap. The success that I’ve had in driving those conversations has come from asking, “How do you get creative in convincing people that doing so expands the size of the pie, and doesn’t just take a bigger slice of the pie for me or for you?” So finding that benefit for the software developer, the systems architect, whoever you’re working with, and engaging them in a conversation in a way that lets them see the benefit to them from a data science perspective, so that you get that buy-in. Because I know that with their support, my life’s going to be easier, because I’m going to get the data and the access that I need to build a stronger, more robust model.

0:24:36.0 ME: One of the other interesting things that you said that I’d like to amplify: you talked about how in some environments, where the idea of analytics wasn’t taken into consideration in advance, you end up having to go find out if data exists at all, and if it does exist, in what state it’s captured. Is it fragmented, dirty, sporadic? What do you have available to you, and what state is it in? You have to do that before you can even decide, “Okay, here’s the problem we want to solve, here are the things we need to know, here are the desired outcomes, or the things we want to decide along the journey. So I need this data. What’s in the system already?”

0:25:19.4 ME: So that impacts people’s perceptions of the adoption velocity of data people too, I would think. In other words, somebody says, “Dude, all I want to know is what’s taking you so darn long.” And your answer is, “But you never looked at this before, so we don’t collect all of the data. Some of the data we do collect is in 700 repos spread out across who knows, time and space, and most of it’s dirty. So before I can even get to my job, I have to find the data, clean the data, get the data, and then get people to re-factor stuff.” That makes it look like you guys are slow. So how do you handle that? What kind of experience are you having there?

0:26:04.9 JH: Yeah, so that ties directly into the power of storytelling. The power of storytelling of the journey: not waiting until we have a shiny object we built and then showing that, but showing the journey that we’re on to get to that object, that output, and so on. Because you’re right, the reality is that often the mindset from those requesting the insight is, “There’s got to be an easy button. You’re a data scientist, we have data, just click your button, hit your mouse, and tell me my answer.” In many ways, those questions that are being asked of us are all in themselves mini-innovations, because they’re not standard, run-of-the-mill questions.

0:26:53.8 JH: You captured it well, Matthew, in terms of, “We’ve got to go and find this data, clean the data, experiment, iterate on those experiments, potentially bring it to production, build an interface for it to be consumed,” and so on. And so it’s important to be vulnerable and honest about that journey and educate those stakeholders: “This is the reality of the current state, what we’re working with. You came to us with your question, we’ve gone out and done our initial assessment and exploration, this is the current landscape that we have, and because of that, this is going to be the roadmap, the timeline to achieve what we need, and we’ll engage with you as we go forward.”
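
That “initial assessment exploration” often starts with a crude data profile: row counts, missing values, duplicate keys. A minimal sketch over hypothetical rows (the column names and values are invented for illustration):

```python
# A first-pass profile of the kind an initial assessment produces.
rows = [
    {"customer_id": 1, "email": "a@x.com", "signup": "2021-03-01"},
    {"customer_id": 2, "email": None,      "signup": "2021-04-15"},
    {"customer_id": 2, "email": "b@x.com", "signup": None},
    {"customer_id": 4, "email": "",        "signup": "15/04/2021"},
]

def profile(records):
    """Count missing values per column and duplicate keys --
    the raw material for the 'current landscape' story."""
    columns = records[0].keys()
    missing = {c: sum(1 for r in records if not r[c]) for c in columns}
    ids = [r["customer_id"] for r in records]
    dupes = len(ids) - len(set(ids))
    return {"rows": len(records), "missing": missing, "duplicate_ids": dupes}

print(profile(rows))
```

Numbers like these (two missing emails, a duplicate ID, inconsistent date formats) are what turn “why is this taking so long?” into a concrete roadmap conversation.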

0:27:39.0 JH: “We’ll have a weekly, biweekly, whatever that time frame is, dialogue with you to update on progression, pivot and iterate, and so on.” But it’s that storytelling that is essential. Going on a bit of a tangent here: in terms of the resources out there to go and educate yourself and become a data scientist, those programs do great at teaching the technical side of data science, but it’s the relationship side, the storytelling side, again, that is as critical as any ability to write an algorithm or program in Python. How you inform people of what it takes, and give transparency to that process, to build that relationship with your business partners, is essential.

0:28:30.5 ME: That makes sense. So the storytelling and the relationships. And it sounds like really, leadership needs to have an understanding of the value and need for analytics to start with, but then they need to have an additional understanding of, it needs to be data by design. And so, you could be walking into a legacy house and you need to figure out how to retrofit. Well, that’s going to have a slower adoption velocity than if I was starting with a brand new system, zero code on a blank screen and I can do data by design. And so the relationship, the communication, the story, that’s probably a pinnacle part of your entire existence, which is communicate.

0:29:11.2 JH: It is, and a good framework that I think can help that story is, one, positioning it as a capability. Often you’re developing a new capability for the organization, which is advanced analytics, assuming you’re not already mature; that’s a different state. And there are really four pillars to that capability building: people, process, technology, and governance is what I put into it. So within those four pillars of people, process, technology, and governance, what do you need to accomplish? What gaps do you have? And tell the story around that. How do I go and resource this properly? Is it a data issue? Is it an application design issue? Is it that we don’t have the right question coming from the business, so we can’t answer it and this is a better question? Within that building of the capability, put the story together, and I think that becomes useful to that dialogue, that relationship building with the business partners.

0:30:14.3 ME: As the idea of data, data science, data analytics is evolving as its own body of knowledge, its own set of practices, you’re actually doing software development in Python and R. That being said, even though your output includes mathing, lots of it, the reality is you’re delivering software in some way, shape, or form that needs to be integrated into a larger ecosystem of some sort. So, different question for you. Based on your experiences, the things that you’ve seen, and the general industry, given that it’s actually a software engineering craft in addition to all of the wonderful analytical math and algorithms, all the things that you’re doing, do you feel like the data science industry itself recognizes that they are software developers, and therefore they also need to be pursuing software craftsmanship?

0:31:11.7 JH: Yes, mostly. That’s…

0:31:15.4 ME: I realize that was meaty. But anytime somebody says, “I build software,” we need to build reliable software, and that requires lots of good engineering practices.

0:31:26.4 JH: It does, right? So it’s a great question, and the reason I say “yes, mostly” is because this relates back to the notion of the different roles and disciplines: data scientist, machine learning engineer, and so on. I follow this as well. I came into this discipline from the statistics side, not from the software engineering and development side. And being vulnerable here, being candid: it shows in the way I write code. I very much write code for experimentation, iteration, and prototyping, in that data science mindset. And you’re right, what’s needed when you take that into production is quality code that meets the Python style guide, stuff like that, commented well, if you believe in commenting, all that kind of stuff.

0:32:16.8 JH: That’s where that software development really comes into play. And I think the reality is there’s probably a bit of a mismatch in skills there, but I think it’s evolving and becoming more refined as we go forward. There is a skill-set difference between those two. Even from the standpoint of developing and leveraging things like GitHub and code repositories, and everything that goes along with software engineering and software development, that has a growing presence on the data science side as well: the collaboration on algorithms, coding, building in a notebook, all that kind of stuff. So it’s a great question, but I would say it’s still predominantly the experiment-and-prototype side first, and then, how do you refine that into well-written production code on the other side of that?

0:33:16.6 ME: It’s an evolution for everybody. Even the historically hardware-based, rack-and-stack, brick-and-mortar data center folks, the infrastructure folks, the people that were historically doing those focused operational behaviors, that world has changed out from under them as well, where we’ve moved into cloud engineering. If I can have 100% software-defined everything, then that means all of a sudden software developers can actually define all of their own infrastructure and networking and failover and all of the rubbish. But at the same time, now the infrastructure folks actually need to become software developers. So we’re watching lots of amazing and awesome things change, and the data world is just another lovely facet of how we’re evolving, building things that are useful to us. Really, ultimately, you just have to figure out, like we all are, “What problem are we trying to solve? What are the desired outcomes, and what are the things that are necessary to get from here to there?” and then design it and do it in such a way, and especially attitudinally, be willing to change.

0:34:30.2 ME: “I am going to break something. I’m not as smart as I think I am, and I have to be reminded daily,” and I do get reminded. It’s just an evolutionary thing. I think this journey that you’re on is phenomenal, and it’s not because you have all the answers; it’s because you don’t. That’s what makes it phenomenal. And I think people miss that when they consider iterative development or iterative change: “It’s okay, tomorrow I’m going to be plus one.” Is that where you think your industry is, absolutely plus one? Or are you thinking you’re 10X daily, like, “Dude, we have a long way to go”?

0:35:12.2 JH: No, I like the way you illustrate that, Matthew. What’s most valuable in that, I think, is the realization that we don’t know everything, and the participants in the room don’t know everything. When you’re pursuing a data science objective, whatever it is, having that understanding that we’re all learning is as valuable as anything. It allows for, and I’ve used this term a few times, vulnerability to be present, and for you to be comfortable with that: I don’t know everything there is to be known about topic X, you may know more than me, but let’s be open about that and build our knowledge collectively. Again, expand the size of our pie, as opposed to one of us taking a bigger slice. That’s an important mindset to have, not only in building and maturing data science and advanced analytics, but in whatever you’re taking on: the scientific mindset. Once you know enough to know that you don’t know, that is a good state to be in.

0:36:28.1 ME: There’s the interesting pure science of this whole conversation, the creation and evolution of an idea, and then there’s the operational science of this idea, which is, “This business has allocated a million dollars to this project, and it has some amazing set of features that need to exist, that serve these users and these industries, and there’s a definition of done, desired outcomes and all that.” There’s a box. And so somehow, you have this amazing challenge of telling a story that makes the idea of data, where it is in its life span, and the value of data as it relates to this business and project, come to life for somebody to say, “Yep, we should be doing this for sure.” But then you have to figure out how to get inside this existing, moving organism as well, which is, “We build stuff, we move it into production, we generate revenue, serve clients, make them all smile.” You’re building a plane and flying it at the same time. And for people that don’t know, this isn’t a video podcast, but we do use Zoom so that we can interact with each other on video, and Jacey, you’re still smiling this whole time like, “Yeah, this is a bunch of crazy, and I love every second of it.”

0:37:43.6 JH: Yes, it’s enjoying the journey, enjoying the grind, whatever term you want to give it. That is essential, again, not just for the path I’m on or you’re on, Matthew; whatever it is, it’s falling in love with that journey and the chaos of it, and the opportunity to learn within that space. My personality, I’m driven by learning. If I see something as an opportunity to learn, that’s what motivates me to go and pursue it and take it on, and data science, advanced analytics, this whole discipline space is rich with that. It’s learning every day. For me, it’s learning a new algorithm, a new mathematical concept, a new development idea, how to integrate and move into a cloud environment. That’s a whole other beast in itself, with all the services of the cloud and transforming from on-prem to cloud and everything that goes along with that. So the space for learning is vast, and that’s exciting, as it should be.

0:38:50.8 ME: So as we start to wrap up, I wondered if we could get your viewpoint on the idea of data and all of the roles, just as examples, the roles that you’ve talked about; they may or may not exist in all of the different companies or all of the different HR frameworks, or whatever it is we want to talk about. And the value of data: when, how, and where to include it. Do you do it up front or in arrears? Am I good with Frankenstein? What’s my adoption velocity? Why did it cost so much money just to get this data? That crazy, crazy mess. If someone says, “Hey, I want to figure out what data analytics is, and I want to figure out how I can leverage these things to evolve my company,” how do people figure out where to start? Is there a clean answer, or is it context-driven? Is it just always messy?

0:39:44.8 JH: My perspective is that it starts with understanding, “What are the desires of the organization?” Obviously, “Are we developing a new product? What does our strategy look like?” All that kind of stuff, in terms of that vision going forward. And from that, it’s understanding, “What does the current data landscape look like?” And that’s a beast in itself to define. But really getting your mind around that as a starting point can often inform, “What are we capable of? What can we do now? And who or what resources do we need to level up and move forward?”

0:40:25.0 JH: As poor as this can sound, I think oftentimes companies like to just jump to, “Let’s get a data scientist, they’ll solve it.” Well, when the data scientist comes in, if they don’t have the data to work on, they’re just floating out there, trying to figure that out, missing that piece. And so data as a foundation, and working on that, I don’t think it’s ever solved, but focusing on it, building it so it becomes a true resource and not just exhaust, that is, I think, the initial, essential focus to launch from. And in that, it may be a combination of data science and data engineering coming together, whatever that is. But from my perspective, that foundation of a strong, robust data environment is essential to any success that can come out of the venture, the path into advanced analytics, machine learning, AI, and so on.

0:41:25.2 ME: If you don’t know what you want to know, or you don’t know where you want to be after this effort has happened, adding a data scientist isn’t going to change anything other than your budget and your run rate; it’s not going to change your outcomes. So it’s kind of like the grocery store: you shouldn’t ever go on an empty stomach, and you should know why you’re going there before you walk in, or don’t send me. That’s the net. You really need to know where you want to be; otherwise, don’t just hire somebody.

0:41:55.7 JH: From a data science perspective, hearing the phrase “Go and discover something for me in the data” is often a little cringe-worthy, because you need an objective. I need to know, “Am I trying to make lasagna? Then these are the ingredients I have to go get from the grocery store to make lasagna.” Sending us on a wild goose chase, saying, “Go and find X millions of dollars in the data,” it’s possible, but it may not be super probable. But with an objective, “We’re trying to solve this question, this business problem,” now we have something concrete to anchor around, to go look for in the data, and to build for a purpose and objective.

0:42:38.9 ME: Well, I think we ought to go explore some more of these subjects together. So for today, what I want to say is thank you, and I look forward to talking with you again real soon.

0:42:49.8 JH: Thank you, Matthew, I appreciate it.

0:42:55.0 Speaker 2: The Long Way Around the Barn is brought to you by Trility Consulting, where Matthew serves as the CEO and President. If you need to find a more simple, reliable path to achieve your desired outcomes, visit

0:43:11.4 ME: To my listeners, thank you for staying with us. I hope you’re able to take what you heard today and apply it in your context, so that you’re able to realize the predictable, repeatable outcomes you desire for you, your teams, company and clients. Thank you.


Never Leave ‘Em Guessing

The world according to Melissa Creger means you’re never left guessing. As someone who values and practices transparency, she has a history of earning trust, being flexible, keeping an open mind (and ears) so her thinking is always challenged. 

“I don’t ever want to guess how I’m doing, and so I never want my clients to be left guessing either,” shared Creger, whose career has historically been one of connecting and helping people. She got her first taste of “digital transformation” during her work with the Alzheimer’s Association when the organization realized it needed to move from pen and paper to online signups and payments.

“This experience gave me insight into how technology is critical for sustainable growth,” she added. This past experience with technology as a user, coupled with her tenure working in the technology industry since 2012, made her a great fit for Trility Consulting when she joined the team as a Director of Business Development in July.

“Melissa values transparency and seeks to set clear expectations when managing relationships, and the by-product of that approach is trust,” said Brody Deren, Chief Strategy Officer. “We are excited to have her join the team and support our growing client partnerships in Omaha and beyond.”

Trility’s outcome-based delivery method means clients receive observations, recommendations, and options to iterate for the best, highest-priority outcome. Creger will help build upon this proven approach and ensure we continue to deliver over and over again on our promises – meeting time, budget, and scope that aligns with business and technical requirements. 

Connect with Melissa

Interested in learning more about Trility? Email or connect with Melissa Creger on LinkedIn.

About Trility 

Comprised of technologists and business consultants, Trility helps organizations of all sizes achieve business and technology outcomes. Clients appreciate that our teams solve problems contextually and bring their people along to ensure a reduced cost of ownership long after the engagement is done. Areas of focus include:

  • Cloud and DevOps
  • Product Design and Development
  • Information Security
  • Data Strategy and Management
  • Internet of Things (IoT)
  • Operational Modernization

Trility is the only business and technology firm with a proven history of reliable delivery results for companies that want to defend or extend their market share in an era of rapid disruption. Headquartered in Des Moines, Iowa, with teams in Omaha, Neb., Kansas City, Mo., and Denver, Colo., our people live everywhere, and we serve clients from all corners of the United States and globally.


Podcast, Part I: Vulnerable Storytelling to Advance Data Science

Show Highlights

You wouldn’t think a data scientist would tout vulnerability and storytelling as requirements for success, but that is exactly what Jacey Heuer has learned across multiple industries and projects that have failed and succeeded. In the first of this three-part series, Heuer shares that “what you think you know today should change tomorrow because you’re always discovering something more.”

Key Takeaways

Success in data science means:

  • Acknowledging that 80% of projects never make it into production, not because of a failure of science but because of a failure in communication and vulnerability. 
  • Putting yourself out there by connecting with different people. 
  • Acquiring and honing new skills and behaviors that support a deeper understanding of systems thinking and the dynamic variables within those systems.
  • Always iterating and reinventing. The work is never done, and it’s never easy.

Three distinctions for roles and responsibilities:

  • Data Analysts work with stakeholders in-depth to understand the problems, goals, and outcomes needed.
  • Data Scientists focus on prototyping and exploring and twisting and turning data – looking for the algorithm.
  • Machine Learning Engineers productionalize the output.

Read the Transcript

0:00:57.9 Matthew: On this episode of The Long Way Around The Barn, we kick off a three-part series with Jacey Heuer, a data scientist with a passion for learning, a passion for teaching, and an unquenchable passion for helping leaders understand the profound impacts of data-based decisions. I absolutely loved my conversations with Jacey, and was surprised and highly interested when he told me how vulnerability and storytelling were two of the greatest attributes of a useful data scientist. In these podcasts, Jacey shares with us a little about his personal and professional journey as a data scientist.

0:01:37.6 Jacey Heuer: And what I feel today might change tomorrow, and so on. The core component of that is the scientific thought process. I’m not going to get too far ahead, but that’s something that connects with me deeply. Part of the reason I’m a data scientist is this: your vision, what you think you know today, should change tomorrow, because you’re always discovering something more. That’s the scientific process.

0:02:00.8 Matthew: His views on the development of data science as a body of knowledge and professional practice, how companies can realize the value of data decisions, and what people need to explore, learn and pursue in order to become a credible data scientist. JC, thank you for taking the time to meet with us, talk with us, teach us and just include us. Tell us a little bit about… We know currently that you’re working in the data space on purpose. You love it, it’s a passion, it’s your journey, it’s your current chapter or multiple chapters, but tell us a little bit about your journey, Where have you been? Where have you come from? How did you end up here? And then tell us about where you are and where you’d like to be heading. Teach us about you.

0:02:50.3 Jacey Heuer: Thanks for having me, Matthew, I appreciate it. And I liked the emphasis on purpose there. So my journey started… I’ll go way back to start with maybe, right? So I started off as an athlete, very focused on athletics. Coming through high school into my undergrad, I was gonna play professional basketball. So I’m a pretty tall guy, relatively athletic, depending who you talk to. And so that was really my initial journey. Various reasons it didn’t pan out. I ended up graduating and getting my undergrad, and finance is kinda where I started. And so there’s a lot of connection into data with finance, accounting, stuff like that. It’s not a stretch by any means, to get to the data side of that discipline. I started off in financial analytics, and then decided to go back and get my MBA. And so I was getting my MBA at Iowa State around the time that data science was really becoming more of a mainstream term. It was noted as being the sexiest job of the decade and all that kinda stuff. Around this time is when it was first getting popular. And so that was kind of my initial motivation, to be like, “Yes, I like finance.” I’m getting this sort of data bug as I step out into the professional world.

0:04:14.1 Jacey Heuer: Going through my MBA course at Iowa State, I was introduced to some text analytics classes and courses, which was really my first real step into what I would call real data science, that movement beyond traditional business intelligence, financial analytics, stuff like that. So I got some exposure there. Out of that, I started to really focus on, “What is this career path that I want, where do I want to go, and how do I do this within this data science space?” So I started networking, as cliche as that can be, just getting my name out there, meeting people, stepping out, being vulnerable, putting myself out there, connecting with different people, and I was able to take a role in data analytics with commercial real estate. There are some traditional applications of analytics there, but what I was looking for, a transformative data science application, was a new thing in commercial real estate at the time, and it’s still a relatively new thing. That industry is relatively data-tight; data is held close to the chest and not always publicly available. There are ways to work around that, but that was my first big opportunity and big step into this journey of data science.

0:05:30.6 Jacey Heuer: And so I was able to finish my MBA, start this role with this commercial real estate company, leading their international commercial real estate research publication. So we’re doing analytics on Europe, on Australia, on the US, similar countries around the world, understanding different forecasts around interest rates, around metro markets, all this kinda stuff, drivers of hotness in the commercial real estate industry across these metros and things like that. That was sort of my real first taste of a data science professional setting. I’m really diving into this knee-deep. From there, this was kind of in tune with when more universities were now starting to catch up and launch their graduate programs around data science, so I decided to go back, earn my graduate degree in data science. Out of that, it was just kind of a launch pad to keep moving forward then. And I’ve always had this kind of notion in my mind, as I’ve gone down this journey is, there’s currently this double-edged sword of, how often should you change? Should you take an opportunity? And how long should you stay in that current role before you feel like you’ve learned? And… What’s that balance of, “Am I going too fast? Am I going fast enough?”

0:06:46.7 Jacey Heuer: And to me, I’ve landed on that side of trying to… As mystical as this can sound, listen to the universe; not give too much thought to it and just kinda let it flow. So when an opportunity comes along, it’s an assessment of, “Does this really feel right to me? If it does, let’s take it.” That’s given me the ability to practice and step into data science and work in the data space across a few different industries. So as I’ve gone forward, I’ve worked in… I mentioned commercial real estate, financial services, e-commerce, now manufacturing, the energy industry as well, and been able to experience, really, different company dynamics, different sizes of companies, and how they approach data, data science, data management. What the nuances of changing a culture to be more open, to being data-driven, what does that mean? What are the challenges of that? And that’s really been what’s led me to this state, and I think what’s kinda guiding me forward as well. It’s listening to the universe, listening to the flow, accepting kinda what comes next, and then just kinda moving forward with that. If that makes sense, hopefully, but…

0:07:58.8 Matthew: No, that’s outstanding. One of the things that struck me, and you may already be aware of this pattern, and I’m just catching up to you: in order to be an athlete on purpose, you have to be aware of a universe-level or system-level, whole-system-level set of variables, and all of these variables in the system are dynamic. Some of them might be static, some of them are variable. And all of these things are learning new skills, honing existing skills, deciding to try and make some things, some behaviors, some quirks, some types of behaviors go away. But your goal was to take all of these system variables, understand these variables in the mix, and move forward in some way, shape, or form. Whether you tacitly recognized it at the time or not, it seems like, as a purposed, goal-oriented athlete, you were already a systems thinker. What’s interesting then is how you translated that systems thinking into another, more well-defined discipline for your undergraduate degree, finance, which was also systems thinking, also structure. Did you do that on purpose? Did you discover it along the way? That’s an interesting map from my perspective, right off the bat.

0:09:22.2 Jacey Heuer: I would say that wasn’t on purpose by any means. It was more of a, “This is my personality, this is sort of this… ” Again, not to sound mystical, but it’s sort of that sense of, “This just seems to fit as the next step, and let’s take this, and put myself out there and see what happens.” I think you hit the nail on the head, Matthew, when you talk about that systems thinking from an athlete’s perspective. It’s having that sort of top-to-bottom, bottom-to-top, thorough understanding of: How does the team work? How do the pieces come together? What’s that more macro vision, that strategy that we’re going after, and how do we deliver that strategy within these sort of subcomponents? And something I’ve noticed, as I’ve gone further in my career with data science… And I think this is common across many disciplines, many practices: there’s sort of a balance between those with that real depth of technical skill set, who can knock out, “This is my task, I can do that task,” and those with the ability to really see what’s the relationship of that task to the bigger whole and connect these pieces together. And I’ll say, from a data science perspective, the skill set to really understand, “How does this algorithm, this thing I’m working on, tie in to that business impact, tie in to the bigger whole?” That’s a valuable skill set to have.

0:10:56.3 Jacey Heuer: And I’ll say, for me, having both an MBA and a data science master’s degree, putting those two together has given me that sort of benefit where I can understand, if I’m building this algorithm, writing this code, what’s the impact to the business? And how do you speak to that impact to build those relationships with those that are ultimately going to adopt this output? That’s the feedback that we want, that we’re seeking, and why a common statistic for data science is that something like 80% of models and algorithms never make it to production. That’s a huge failure rate. And a lot of that is, you’ll do all the legwork, the foundational work, getting it up to that state, and then you go that last mile to get adoption, and you don’t get that buy-in from the business; that relationship isn’t there, that trust isn’t there. And that’s something where, on the athlete’s side, as a basketball player, you know if that’s gonna happen more immediately. You know if I’m taking the shot or I’m passing the ball to this person, they’re either gonna take it and shoot it and score or not. You know that they’re accepting your pass. You know it’s gonna happen. On the data science side, it may not be evident or obvious right away. You may go through all this work, three months down the line, just to find out that what you were building doesn’t get adopted, and it falls into this abyss of what could have been data science.

0:12:26.9 Matthew: That map, from your bachelor’s degree in finance to then doing an MBA to get a broader perspective, it almost looks like a funnel, as I’m visualizing some of your journey, where the athlete himself was starting out as a systems thinker, so that’s already a wide funnel, if you will. And then finance was starting to apply structure and discipline, and honing some of that stuff, but just raw talent’s not enough to be a pro ball player. Just raw talent gets you down the road, but it doesn’t help you last. So somewhere along the way, you said, “I must focus, I must have structure, I must have purpose.” Somewhere, you chose that. To your point, listen to what you’re hearing and make decisions contextually, but you became aware of the need for doing something on purpose, and thinking about all of the variables, you moved into the MBA conversation with a data focus. The interesting thing about the MBA, from my perspective, is it’s not designed to give you the answer to all possible questions, but it is designed to make you aware of how very many different bodies of knowledge exist to just even make an operation operational and then healthy and useful.

0:13:44.9 Matthew: So you have this interesting blend between you want to be a competitor, a high-performing competitor, who is disciplined, to someone who’s now focused it to, “I understand math, I understand models, I understand the value proposition of an idea,” to then moving into, “Hey, there’s all of these things it takes to run a business, not just data stuff. But data helps drive, equip, enable, educate people to make decisions, but there’s all these other things as well. They all require data, but they’re all different types of behaviors.” You’ve walked into this data role, being aware of the need for systems thinking, of discipline, knowing that you’re not the only person in the company with a brain doing thinking, but then also realizing that the things that you’re creating need to be relevant to all of the other people in the business, or else it inadvertently supports that 80% of all models never make it to production. 80% of all shots taken never making it into the basket; that would be a fairly brutal statistic as a pro ball player. So in the data industry, that seems like some people are getting a lot of forgiveness, if you don’t mind my… What I’m saying there fairly directly is, 80% as an industry number? That’s pretty tough, dude. What are your thoughts on that?

0:15:09.4 Jacey Heuer: Yes, you hit the nail on the head, Matthew. And I think the mindset with data science, with AI… On one side, there’s a lot of buzz, a lot of media coverage that drives a lot of it, and while the media coverage can be hyperbole sometimes, the foundations of it are real. And the reality is that I think a lot of organizations, a lotta industries want to jump to, “Let’s just throw an algorithm at it, let’s just throw machine learning at it, and it’ll work,” without really realizing that the foundations, the data foundations underneath of that, the quality of that data, the governance of that data, the culture around managing that data, that is what drives the success of those 20% of models that get into production. It’s coming from having robust foundations in your data.

0:16:06.9 Jacey Heuer: And that’s probably the biggest distinction there, is that any model, any analytics that you’re doing, really, is a small piece of it. Once that data foundation is in place, it’s much easier to iterate, experiment, prove value to your business partners, your stakeholders, and have a shorter putt to get to that adoption, and push through the end zone with that. And that, I think, is what gets lost in that 80% that doesn’t make it to production. As much as part of that’s maybe because of the relationship with the business, well, that relationship struggles because of the complexities that you’re trying to go through on the data side, and any of the confusion around “Why is it taking so long? Why can’t you just push the easy button?”, all that kinda stuff comes with that sort of messiness in the underlying data. Does that make sense?

0:17:05.5 Matthew: It’s the sausage-making conversation, right? Have you ever been to a product demo? Many people have. Have you ever been to a product demo where all of the technical people said all of the technical things, but the people that were paying for the product development didn’t understand a single word that was spoken, like, “I know you said things. You seemed very excited about them. You seem confident. That makes me confident. I still have no idea what I just bought.” That seems like an easy gap that could exist in the data science world, to the executive leadership world inside a company, for example. For all of the executive leaders out there who are making decisions based on a single pane of glass, or a dashboard, or they’ve got a lovely, lovely, dynamic Excel spreadsheet with wonderful graphics on one of the pages in the workbook. For people that are trying to distill a whole business down to a single pane of glass, they may or may not be interested in the sausage-making. So how have you found, given all of your background and your awareness of these situations, how do you bridge this gap between, “I’ve got this data science stuff,” and “These guys are just looking for pie graphs”? How do you become relevant when they’re only using a single pane of glass?

0:18:26.3 Jacey Heuer: Yes. And that is, in many ways, the core of the challenge, that’s the art. And really, it comes from… It’s the relationship building, it’s the conversations, it’s the honesty around the vulnerability of letting these stakeholders know, “If we want to step forward into becoming truly more data-driven, changing the way we think about our decision making, leveraging data as an asset, data as a resource and so on, what does that mean?” The reality of it is, you need to find that balance between that single pane of glass and the guts of making that sausage, and you have to pull back the covers a little bit on that. The term I use is the art of the possible. It’s being able to set the stage of, “This is the art of the possible, this is what we can do, if we have the strong foundation underneath of it.” And starting at that, “Here’s the shiny object, and now let’s peel it back and dig further into this and make that journey known, of what’s needed to get to that vision and art of the possible, and now let’s go and resource and attack these sort of sub-components that let us get that far.” And that takes clear communication and vulnerability.

0:19:46.4 Jacey Heuer: Again, I use that term a lot, because there’s no easy button for data science, for AI, for ML. As much as companies and vendors will push, “This is auto-ML, you can point and click,” all that kind of stuff, there’s a lot of work that goes in underneath of that, to make that work and work well for changing a business, changing the way they operate. Again, it’s giving that kind of clear vision of, “What can we bring, from a data science and advanced analytics perspective, to the organization?” and then laying out in honest terms, “These are the steps that we need to take, where the gaps are, and how we can start tackling that.” Because it’s that vision that can hook someone, and then going on that journey of, “How do you fill in those gaps, to get to that?” That’s the key, and making the partnership known.

0:20:43.7 Matthew: So set expectations, manage expectations, and in all cases, communicate and over-communicate.

0:20:51.1 Jacey Heuer: Correct. Iterate and iterate.

0:20:51.5 Matthew: And iterate.

0:20:56.0 Jacey Heuer: One of the key things I like to do when I enter into an organization, it’s go around and have these data science road shows. So meeting with different groups, different departments, and just educating them upfront, on, “This is the data science thought process, the data science project process. And what does that mean? And how is that different from maybe traditional software development or traditional engineering and things like that?” The data science means experimentation, means iteration, means going down a path, learning something, and then having to go back three steps and do it again. And so, it’s not a linear process all the time, but it’s very circular and it’s very iterative. And even when we get to the end of that path, we produce something. That thing we produced, may need to be re-invented a couple of months later, or you launch an algorithm and a pandemic hits, and what was driving that algorithm no longer has as much meaning because of the new environment. So you have to go and re-build that algorithm again and re-launch it again, because there’s new information being fed into it.

0:22:04.0 Matthew: There’s an interesting parallel inside organizations, which I imagine you’ve already seen and noticed because of your bachelor’s and your master’s: the idea of financial modeling, modeling itself and forecasting, whether it’s go get a brand new vertical market, whether it’s segment a market, whether it’s create a new product and create demand for the product. The idea of finance has been around for a long time, and it’s understood by most, it’s discussed in undergrad and grad school, and even if people don’t go to university of any kind, everybody is familiar with, “You need to make more money than you spend, or else you’re upside down, you have a problem, you won’t last long.” But if I want to live for a very long time, I need to forecast. In other words, I need to say, “Based on the things I know today and the things I think I know about tomorrow, what will it take for me to get from where I am to where I need to go?”

0:22:53.0 Matthew: That forecasting idea, that’s an old idea, and it’s in companies already, today. And I’ve seen it done wonderfully and I’ve seen it done horribly, and the difference was communication, where somebody took the time to say, “Look, man, based on these 15 assumptions and these 17 system variables, which I don’t control any of them, and based on the things you think you want to be when you grow up, 19 months from now, here is version A, B and C of my forecast,” and people tend to accept that as, “Okay, given all of the knowns and the unknowns, this makes a lot of sense. You made me feel good. Okay, goodbye.” In the data space, it seems to be similar, but I wonder if that’s just a new enough idea that people don’t understand what they’re buying yet or how to use it yet, and so when you mentioned that, “Let’s just grab some MLs, let’s just grab some AI, let’s just grab that little algorithm and put it into my Excel spreadsheet,” I wonder if people don’t fully understand exactly what it is, what to do with it and how to make best use of it right now.

0:24:02.9 Jacey Heuer: I think you’re correct in every aspect of that. It’s sort of the shining light on a hill, the shiny object that’s lingering out there, that I want to grab on to, and it sounds great, it sounds cool. And again, not to discount it: ML and AI are real, the expected benefits are real. The readiness for some organizations to really adopt it may not be as real. And I think that’s a key concept to keep in mind. Depending on the organization, there can be a lot of ingrained processes, ingrained mindsets. I’m going to look at the data to justify a position I already have. The confirmation bias. I already know what I want to find out, I’m gonna go find it in the data.

0:24:53.2 Jacey Heuer: So if I apply a ML model to that and it tells me something different, I’m not gonna trust that, because I have… I know what I already think, and that’s what I want. That’s one of the walls that, as we build data science into an organization, how do we tear that wall down and change that mindset to overcome that confirmation bias, the selection bias that may be present? And it may be built on years of experience, “This has worked for me for 30 years. Why would I change now?” Well, there’s more data becoming available, the industry may be changing, the environment’s changing, we’re in a pandemic, we’re in whatever it is, that’s the promise of data science, is, it’s quicker, more consistent, in many ways, more accurate decision-making that can come out of those models, those efforts.

0:25:48.4 Matthew: It seems to me, based on my own journey, based on the increasing numbers or classes of data that we continue to collect, that we didn’t use to collect… We collect so much more data today than we ever did, and it’s only increasing, so at some point, the idea of a super smart financial controller or CFO being able to take in all of this multi-dimensional data and make sense of it in order to create a credible forecast… It seems like the manual forecast will become less and less reliable, as the multiple dimensions of data that we collect continue to increase, and not even at the same rates of speed. My guess is that we’ll just be in denial about the reliability of our ability to forecast multiple dimensions in Excel, as opposed to recognizing that, “Hey, I want to do the same thing, but now with all of this data, maybe I need to go figure out what this ML thing is, or what is this AI thing, or… ” It just seems like the magic of the forecaster needs to change.

0:27:00.8 Jacey Heuer: What I think of, when you mentioned that, Matthew… I don’t know if I’d call it the magic of the forecaster, the mindset needs to change, maybe. It’s the base skill sets that go into this, go into forecasting, go into modeling, it’s the understanding of, “As I obtain more data and try to translate that into an action, translate that into conversation that a leader can take an action on, what are the skill sets I need, to be able to make that translation happen?” Because the data, the ML, the algorithm, as companies become more refined, more robust in their ability to build that foundation of data, that will continue to improve and become, I think, easier to get to, “This is my forecast, and it’s a more robust forecast because I’m taking in so many more variables, many more features into this forecast, and I can account for having an expectation of different anomalies and things like that to occur.” But my role as a forecaster now, has to be, “How do I translate that into meaningful action for the business and tell that story and convince the leaders of that action?”

0:28:17.0 Jacey Heuer: And I think that’s something where, academically… And there’s many boot camps and things out there, that build the technical skill set for data science, but what’s still catching up, is that communication, it’s that relationship building, it’s, “How do I tell the story in a way that’s actionable and that drives trust in my forecast, in what I’m doing?”

0:28:41.9 Matthew: In my mind, at least, it is similar to the technical people who demo a product: they say technical things during the product demo, but somehow, they’re completely irrelevant to the people that are supposed to be benefiting from that whole journey, ’cause nothing that was said mapped. Let me tell you about it: your five-year goals say this, your current books say this, your forecast says this, we’ve aggregated this data. After we take that data and look at it multi-dimensionally and we forecast it out differently, you have to take all of this giant universe of stuff and not talk about it, and distill it down to something that’s just plain relevant. In other words, what I think I’ve heard you say so far is, you could be the smartest data scientist on earth, and if you don’t have the ability to communicate, you’re in that 80%.

0:29:36.4 Jacey Heuer: Yes, you hit the nail on the head, Matthew. That’s the key right now, it’s that communication, I think, that drives a lot of that adoption. There’s pockets of, I think, industry spaces where that may not be as necessary. I think of, if a company is founded around data and data is at the core of their organization, I think of a start-up, think of any… Put your tech company in here. Generally speaking, I think they have a stronger data culture, because their product is data. But when you’re talking about many other industries that are out there, manufacturing, energy, in many ways, things like that, where it’s… You’re stepping into a legacy company, a company that may be 100 years old, and it’s going through this transition to become data-driven, that’s where a lot of that challenge, and even more so, the emphasis on that communication becomes pertinent to the success, to changing that 80% failure rate to 50%, to majority of these are getting implemented. That’s where, at least in my experience, having worked in those industries that have some of these legacy old companies, that’s a key to success, is that communication, that relationship building.

0:30:57.8 Matthew: So, that 80%, really, may more accurately reflect just an inability or a lack of success in setting and managing expectations and communicating. It’s not a failure of science, it’s just a failure of us being people. Being a person is hard and communicating is hard, it’s the science where we can find peace.

0:31:21.0 Jacey Heuer: Yes. Right. To put it another way, the art is what’s hard, the science is straightforward. I know the math, I know the linear algebra, all that kind of stuff, and that’s the way it is right now, as far as we know. But it’s the art of, now, translating that into something meaningful. That’s a big component of it.

0:31:47.5 Matthew: So I’m… Jacey, I haven’t done the things that you do, and I’m not even intending to assert that I know all of the things that you do. If I’m able to start on a greenfield project, then I’m able to do all of the things the way I think they should be done, and anything that doesn’t happen as it should is on me. Oftentimes though, to your point, we end up in legacy situations, where the company is 100 years old, 140 years old, or it’s been under the leadership of a particular C-suite for the last 45 years, whatever. In all of those situations, that does represent, probably, growth, it represents constancy or continuity, it represents a good strong company, all of the things. But it also represents the way things are done, and it might also, then, be an additional challenge. So for me, if I need to take all of the data in an enterprise, take that all together and meld it together, and do a single pane of glass for a C-suite for them to say, “Aah, I can now make a decision.”

0:32:42.3 Matthew: The journey to get to that lovely single pane of glass, like Star Trek, just walk around, hold it in my hand and I can see the entire stinking ship on that one screen, it’s ridiculous. ‘Cause I can have 105 different repos, data repos out there in various states of hellacious dirty data, to, “Oh my gosh, just flush this stuff,” to, “That is gorgeous. Where did that come from?” to stuff that’s in data prisons, the stuff that’s outside the walls. In the worlds that I’ve walked, to get to that single pane of glass, that journey is not peace, it’s just a lot of stinking work. But what’s it like, for you?

0:33:22.6 Jacey Heuer: I chuckled a little bit at that, because it’s chaos in many ways. That’s the reality of it. Especially with these old, mature companies, generally… I don’t want to put a blanket statement out there, but just given what I’ve worked in, it’s just the reality of it. It’s the way they’ve gotten here. The company may have been around for 100 years. They found success somehow, to be here for 100 years. But the result of it can be, from a data perspective, that you have many different systems, applications generating data, data that’s… It’s not built for data science, it’s maybe built for reporting. The term I give it is data exhaust. It’s just not really in a usable format, and there’s knowledge gaps. The person that built the database may not be with the company anymore, or the database is still in use, but no one has any real knowledge of what’s in there. There’s data flowing into it, but how do we map it and get it out? Things like that.

0:34:25.0 Jacey Heuer: And the path that has been useful, in trying to work through that, drive a transformation into something more modern, more updated, more usable for data science, it’s finding those champions within the owners of that data. So where that data is owned, going out, and again, it’s back to communication, it’s back to the art, but it’s finding those champions. And not to get too granular on this, but something that’s worked for us is working to establish a true data council, a data stewardship, where you have this representation, where, instead of data being this by-product that doesn’t have a forefront, a key role in the business, it now takes a step into the forefront. The ownership is established, and the connection to the goals of the organization is built out. So now I have this council of individuals representing the different parts of the business that are generating the data, and they have a voice in, “How is this being used?” and have transparency and clarity into, “This is how we would like to use it.” And the conversation starts: “Well, then, this is what we can do. I didn’t know that. That’s interesting.”

0:35:44.8 Jacey Heuer: You start that communication through that council, through that stewardship program, that is the first step to getting to that foundation of a robust data layer. Now you can build that data science on top of… Build that AI and ML on top of… And start that transformation. What can be, I think, challenging in that, depending on the goals of the organization, it’s the time and resources needed to really do that, and that’s a mountain to climb in itself, is, “How do you convince of that story, that this is what we need to get to that next step with data science, AI, ML, all that kind of stuff?” That’s a journey in itself.

0:36:29.7 Matthew: Do you find, in your profession, that you’re asked or expected to, or you find the need to differentiate or define what is data science, what is machine learning, what is artificial… Do you have to differentiate these things, and how would you define that for us today, knowing full well that you may have broader and deeper things to say, than we’re all prepared to receive?

0:36:53.3 Jacey Heuer: I think of it this way. It’s not uncommon. With anything that’s new, there’s a fair number of examples out there where you ask three different people to define something, and they can have three different definitions of it. What does this mean to you? And it’s the same thing with the data science space. The way I break it down is in a couple of ways. On one level, in terms of data science and data analytics, it really falls into three categories. There’s sort of the diagnostic, descriptive sort of category pillar, which is, many companies will have some version of this, where maybe we have a SQL server, we can do some reporting, maybe we visualize it in Power BI or Tableau, we can see what happened. That’s really that sort of descriptive, diagnostic pillar.
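[Editor's note: as a rough illustration of that descriptive/diagnostic pillar, here is a minimal sketch of the kind of "what happened" summary a reporting layer produces. The table and numbers are made up for illustration, not taken from the conversation.]

```python
import pandas as pd

# Hypothetical sales records, standing in for what a reporting
# database or BI dashboard would hold.
sales = pd.DataFrame({
    "region":  ["East", "East", "West", "West"],
    "month":   ["Jan", "Feb", "Jan", "Feb"],
    "revenue": [120.0, 95.0, 80.0, 110.0],
})

# Descriptive/diagnostic view: summarize what happened, by region.
summary = sales.groupby("region")["revenue"].agg(total="sum", average="mean")
print(summary)
```

The predictive and prescriptive pillars build on exactly this kind of foundation, which is why data quality underneath it matters so much.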

0:37:43.7 Jacey Heuer: The predictive element, that’s where we’re taking that sort of understanding of the past and now giving some expectation of what’s to come; we’re guiding your decision on what we think is going to happen. Putting some bounds on that, confidence intervals, things like that. And then the third element or pillar is the prescriptive pillar. This is where we’re taking those predictions and now giving that recommendation. What’s the action that we think should happen, based on our understanding of the data, of the environment? If we tweak this lever or turn this knob, we can drive some outcome, and that’s our prescriptive recommendation. We’re gonna decrease price 10%, we’re gonna increase quantities sold 30%, elasticity.
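[Editor's note: to make that prescriptive example concrete, here is a minimal sketch of the elasticity arithmetic behind "decrease price 10%, increase quantities sold 30%." The function name is illustrative; this is the textbook ratio, not a reference to any model discussed in the conversation.]

```python
def price_elasticity(pct_change_qty: float, pct_change_price: float) -> float:
    """Percent change in quantity sold per percent change in price."""
    return pct_change_qty / pct_change_price

# The numbers from the example: cut price 10%, quantity sold rises 30%.
e = price_elasticity(0.30, -0.10)
print(round(e, 2))  # about -3.0: demand is elastic (|e| > 1), so the price cut can pay off
```

A prescriptive system wraps this kind of relationship in an optimization: given the estimated elasticity, which price change drives the outcome we want?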

0:38:29.3 Jacey Heuer: That’s, at a high level, how I start to define that: those three pillars. And when you step into specific roles, you think data scientist, data analyst, machine learning engineer, data engineer, decision scientist is out there now, there’s all these different roles and variants that are beginning to evolve, in many ways. You think back 20, 30 years ago, with software development and sort of that path of defining more niche roles and areas of that discipline; data science and the data space is going through that. The key difference comes when I think about defining the data analyst, the data scientist, and the machine learning engineer. I think those are three important roles to understand in the space. And the data analyst is very much on the side of, “I’m working with the business stakeholders to understand a particular problem in depth and sort of lay the ground, the landscape of, this is what we have in the data and how maybe we can help answer some of that.” A lot of it’s on that descriptive side of those three pillars I mentioned.

0:39:41.2 Jacey Heuer: Data science, that’s really that algorithm building. It’s the prototyping, it’s the experimentation, it’s going out and taking this chunk of data, adding more data to it, doing clustering on it, doing segmentation, exploring this in great depth and from different perspectives, twisting and turning it. And we’re trying to find that algorithm, that mathematical equation, where you can input data and get an output that gives us a prediction or some prescriptive action. That’s data science. And the machine learning engineer, that’s who’s productionalizing that data science output. So now you have the data analyst defining and understanding; data science building off that understanding, “Let’s put this into an algorithm”; and the machine learning engineer taking an algorithm and putting it into production. Those are three distinctions that I think get misunderstood, but are important to understand, from a leadership standpoint, from the design of, “What do I need to do data science?” Those are skill sets that are essential for success with this.

0:40:44.4 Matthew: What’s interesting to me though, is how you’re differentiating the data scientist from the machine learning person or ML ops. It sounds like when you were talking about the data scientist, this sounded like a software developer to some extent, to me, which is, I’m taking this idea and I’m building it into a real thing. Then there’s these other folks that take it out and move it into the wild, and that’s an interesting thing to me, because oftentimes in the software development space, there’s the business analyst who may have contributed to the definition of done or the direction, then there’s the folks that are building the thing. But oftentimes those folks that build the thing are the same folks that have to move it out into the ether and then live with it and support it and evolve it. So are you suggesting that is not the same thing in the data space?

0:41:36.1 Jacey Heuer: I think you’re tracking with me, Matthew. I think you got it right. With the data side of it, a lot of it is because of that iteration, and sort of the, I don’t want to say burden, but the role of having to integrate this back into the software development process and manage that integration and maintain model performance. So I think of… If I’m building an application, say a web app, for example, in many ways, I can build it, put it out there, and it lives. There’s quality testing, things like that, but the application I built is pretty well-defined and serves its purpose. If I’m building a machine learning algorithm and putting it into production, once that’s put out into production, it’s not the last version of it that will exist.

0:42:28.3 Jacey Heuer: And so you need the infrastructure to be able to monitor that, maintain that, score that model, understand drift in that model. So what I mean by that is, monitor it for, “This used to be 90% accurate, now it’s 50% accurate. Well, what happened?” So, that’s the importance of this machine learning engineering and ML ops side of this. It’s taking that off the plate of the data scientist, who’s focused on, “Let’s prototype this, let’s go and explore this world of data that’s out there and keep iterating on this,” and letting the ML ops, ML engineering side tie this into software development, into the applications that exist in the organization, into the rest of the IT space within the organization. That’s probably the key distinction there, and why it’s slightly different, I think, on the data side than what it might be on the software development side, if that makes sense.
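The drift check Jacey describes, a model that "used to be 90% accurate, now it's 50% accurate," can be sketched in a few lines. This is a minimal illustration in plain Python, not any specific MLOps tool; the function names and the 10-point drop threshold are illustrative assumptions.

```python
def accuracy(predictions, labels):
    """Fraction of predictions that match the true labels."""
    correct = sum(p == y for p, y in zip(predictions, labels))
    return correct / len(labels)

def check_drift(baseline_acc, live_preds, live_labels, max_drop=0.10):
    """Compare live accuracy to the validation-time baseline.

    Returns (live_acc, drifted): drifted is True when live accuracy
    has fallen more than max_drop below the baseline, the signal an
    ML ops pipeline would use to alert or trigger retraining.
    """
    live_acc = accuracy(live_preds, live_labels)
    return live_acc, (baseline_acc - live_acc) > max_drop

# Example: a model validated at 90% accuracy, but recent scored
# traffic is only right half the time -- the "90% to 50%" case.
live_acc, drifted = check_drift(0.90, [1, 0, 1, 0], [1, 1, 0, 0])
print(live_acc, drifted)  # 0.5 True
```

In practice this monitoring runs continuously against labeled production data, which is exactly the ongoing operational work that distinguishes the ML engineer's role from the data scientist's prototyping.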

0:43:22.6 Matthew: These things sound actually very amazing, JC. Basically, I’m gonna have to cycle on this a little bit, because at first, I thought you were saying the data scientist is like a developer, but a developer typically has to go and live with the things and iterate on those things. Whereas it seems like you’re suggesting these folks are going to invent, create, evolve, but then someone else is gonna move it into the ether. So that makes it almost sound like one version of the word architect in the software world, which has its own loaded… English is hard. Quality, what does that mean to 10 different people? Cloud, what does that mean to 10 different people? Same thing.

0:44:02.2 Matthew: Here’s what I’d like to do, because our time is coming to a close for today. I don’t think we’re anywhere close to talking about a lot of the even more interesting things. For example, you being a practitioner, how would you advise, coach, encourage, teach, or lead other people to introduce data, data science, and data management into their organization? What are those steps? What does it look like? What is good communication? I’d like to talk to you some more, and I’d like to do that in our next session together. So, we’ll save some of it for the next time, but first and foremost, I wanted to thank you for taking this time to teach us.

0:44:41.1 Jacey Heuer: Thank you for having me here today, and we’re just scratching the surface on this, and I’m excited to continue the conversation and go from there.

0:44:54.0 The Long Way Around the Barn is brought to you by Trility Consulting, where Matthew serves as the CEO and President. If you need to find a more simple, reliable path to achieve your desired outcomes, visit

0:45:10.3 Matthew: To my listeners, thank you for staying with us. I hope you’re able to take what you’ve heard today and apply it in your context, so that you’re able to realize the predictable, repeatable outcomes you desire for you, your teams, company, and clients. Thank you.


The Long Way Around the Barn

There is usually more than one way to achieve your goals. Sometimes, the path to the goal is longer than it needs to be because we are all challenged with similar things: We often see what we know or see what we want to see. 

In this podcast, we look for options and recommended courses of action to get you to your desired outcomes now.