Visual grounding
Using the source input as ground truth helps users trust the system and makes it easier to interpret its process and see where it might have gone wrong.
When checking data, I want to be able to see how the system arrived at its answer, so I can trust the data and identify any potential errors in the process.
- AI Transparency and Explainability: Make AI systems transparent and understandable by explaining how and why decisions are made.
- Multimodal Context: In this example the context was an image of a receipt, but context can also include other modalities, such as audio.
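The grounding idea above can be sketched as a data structure that keeps each extracted value tied to the region of the source it came from, so the interface can highlight that region when the user inspects the value. This is a minimal illustration, not an actual API: the field names, coordinates, and `GroundedField` type are all hypothetical.

```python
# Minimal sketch of visual grounding: each extracted value keeps a pointer
# back to the region of the source image it came from, so a UI can highlight
# that region when the user checks the value. All names/values are illustrative.
from dataclasses import dataclass

@dataclass
class GroundedField:
    name: str             # field label, e.g. "total"
    value: str            # extracted value shown to the user
    source_box: tuple     # (x, y, width, height) in the source image

def grounding_overlay(fields):
    """Return (name, value, box) triples the UI can render as highlights."""
    return [(f.name, f.value, f.source_box) for f in fields]

# Fields a hypothetical receipt parser might return.
fields = [
    GroundedField("merchant", "Corner Cafe", (40, 12, 180, 24)),
    GroundedField("total", "$14.20", (150, 310, 70, 20)),
]

for name, value, box in grounding_overlay(fields):
    print(f"{name}: {value} -> highlight at {box}")
```

Because every answer carries its source coordinates, a user who doubts the extracted total can jump straight to the matching spot on the receipt instead of re-reading the whole document.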
More of the Witlist
Presenting multiple outputs helps users explore options and identify their preferences, provides valuable insight into their choices, and can even capture user feedback for model improvement.
Starting with a blank canvas can be intimidating, but providing prompt starters can help individuals overcome this initial hurdle and jumpstart their creativity.
A smart browser assistant that understands the context of your open tabs to offer relevant suggestions and actions, enhancing productivity through transparency and control.
When an observation is added to the context from an implicit action and a prediction is made, users should be able to easily evaluate it and, if needed, dismiss it.
Comprehend and compare large documents by visualizing embeddings and their scores, making vast data sources understandable in a single, intuitive view.
Based on your selection and situation, context menus can help you discover actions and access them quickly.