09.06.23
Perspectives from the front
The Data & Trust Alliance, with support from IBM, offers a new series of perspectives with practical insights from businesses earning trust with data and AI.
The era of data and AI has arrived — and with it, practical challenges for the world’s businesses. Taking a wait-and-see approach? You risk missing a historic opportunity to spark innovation. Worse, you may be disrupted by more agile competitors. On the other hand, letting a new technology proliferate haphazardly within your company poses real dangers: operational inefficiency as well as reputational, and even legal, exposure.
In a new series of perspectives, the Data & Trust Alliance and IBM offer insights from businesses on the front lines of this new era — enterprises that are creating new value and earning trust with data and AI.
Data quality, AI performance and trust
With respect to AI, data quality generally refers to the accuracy, completeness, consistency, timeliness, uniqueness, and validity of a given data set, as well as the data’s fitness for the purpose for which it is being used. But the practical meaning of “high-quality data” in any given context will depend on the needs of the organization and the specific use case involved. Download the Data Quality Perspective here.
Protecting individuals and enterprises through algorithmic safety
Algorithmic safety refers to processes that ensure algorithms don’t produce biased or harmful results. Algorithmic safety practices include evaluating the quality of training data, ensuring that algorithms are appropriate for the context and purpose for which they’re used, and providing education and training for the people who build and use AI tools. Download the Algorithmic Safety Perspective here.
The urgency of AI governance
AI governance is a system of rules, practices, processes and tools that help an organization use AI in alignment with its values and strategies, address compliance requirements and drive trustworthy performance. Download the AI Governance Perspective here.