
Webinar Recording: Testing LLMs: Building Trust and Reducing Risk with Microsoft & KJR

Technology Webinars

If you missed the Microsoft x KJR webinar “Testing LLMs: Building Trust and Reducing Risk”, the recording is now available:

David Waxman, Microsoft’s World-Wide Director of Industry Accelerators, Dr Kelvin Ross, Founder of KJR, and Andrew Hammond, KJR ACT General Manager, discussed the capabilities of Microsoft’s Public Sector Accelerator for ChatGPT and KJR’s journey to implement and test it, building trust while reducing risk.

Microsoft Public Sector Information Assistant

Microsoft Azure’s Public Sector Information Assistant (Info Assist) is a platform designed to empower Public Sector institutions with the capabilities of ChatGPT v3 or v4.

Built to leverage Azure OpenAI, the Info Assistant provides a conversational interface that is set to redefine how users interact with their data. The platform uses generative AI large language models to let Public Sector employees and customers have meaningful conversations with their data.

The Assistant Interface 

The primary goal of the Info Assist is to provide precise, reliable, and current answers to Public Sector employees’ and customers’ inquiries, based on insights contained within their own dataset. The Info Assist was developed using a Retrieval Augmented Generation (RAG) pattern and responds to prompts by drawing on the data ingested into the platform. One of the key elements of building trust is that the Info Assist provides context and citations with each response, offering transparency into the origin of the information.

Examples of citations 

The potential applications of the Info Assist are diverse and limited only by the data ingested into the platform, ranging from technical inquiries by employees to students seeking tailored information about their study programs.
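To make the RAG pattern described above concrete, here is a minimal sketch (in Python) of how a retrieval step can feed source passages, tagged with document identifiers, into an LLM prompt so that answers carry citations. The toy document store, the keyword-overlap scoring, and the call_llm stub are illustrative assumptions for this article, not the Info Assistant’s actual implementation.

# Minimal RAG sketch: retrieve passages from "your own data", label each with a
# source id, and ask the model to answer using only those sources, with citations.
from dataclasses import dataclass

@dataclass
class Passage:
    doc_id: str   # e.g. a file name or record key from the ingested dataset
    text: str

# Toy "ingested" documents standing in for an organisation's own data.
STORE = [
    Passage("policy-leave.pdf", "Employees accrue 20 days of annual leave per year."),
    Passage("policy-travel.pdf", "Domestic travel must be approved by a line manager."),
]

def retrieve(question: str, k: int = 2) -> list:
    """Rank passages by naive keyword overlap with the question."""
    q_terms = set(question.lower().split())
    ranked = sorted(STORE,
                    key=lambda p: len(q_terms & set(p.text.lower().split())),
                    reverse=True)
    return ranked[:k]

def call_llm(prompt: str) -> str:
    """Placeholder for a call to an Azure OpenAI chat model."""
    return "Employees accrue 20 days of annual leave per year [policy-leave.pdf]."

def answer(question: str) -> str:
    """Build a grounded prompt from retrieved passages and ask the model."""
    context = "\n".join(f"[{p.doc_id}] {p.text}" for p in retrieve(question))
    prompt = ("Answer using only the sources below and cite the source id in brackets.\n"
              f"Sources:\n{context}\n\nQuestion: {question}")
    return call_llm(prompt)

print(answer("How much annual leave do employees get?"))

Because every passage carries its source identifier into the prompt, the model can cite where each statement came from, which is the transparency mechanism the citations above illustrate.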

Testing the Information Assistant

With over 25 years’ experience in software quality assurance and risk management, KJR has extensive expertise in delivering government IT services. Our goal is to ensure government technology functions seamlessly, remains compliant, and serves the best interests of Australian citizens.

In recognition of KJR’s expertise in software testing and our keen interest in building trusted AI systems, Microsoft invited us to take part in the beta testing of its cutting-edge tool.

With its keen focus on quality, accuracy, and delivering trusted systems, KJR applied its Validation Driven Machine Learning (VDML) methodology, which focuses on building and validating machine learning models. VDML is a comprehensive framework specifically designed to address the accuracy and reliability challenges of employing Machine Learning (ML), guiding organisations in developing robust and reliable ML models to deliver AI solutions.

The webinar dived into the highs and lows of our testing experience for Microsoft and shed light on the approach we adopted to evaluate this innovative tool.

This collaboration highlights Microsoft’s recognition of KJR’s extensive expertise in software quality assurance, entrusting us with the crucial responsibility of testing a platform poised to redefine how users interact with their data.

Dr Kelvin Ross highlighted that trust can be built into an LLM (Large Language Model) by implementing a number of steps, including continuous testing and evaluation of the LLM’s performance on an ongoing basis. Adopting the VDML methodology also helps ensure the reliability and trustworthiness of the LLM.
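As a hedged illustration of what continuous testing and evaluation can look like in practice, a small harness can re-run a fixed set of prompts against the assistant and check that each answer still contains the expected fact and a citation. The evaluation set, pass criteria, and threshold below are assumptions for the sketch, not KJR’s actual VDML test suite.

# Sketch of a regression-style evaluation loop for an LLM assistant.
EVAL_SET = [
    {"question": "How much annual leave do employees get?",
     "must_contain": "20 days",
     "expected_source": "policy-leave.pdf"},
]

def evaluate(assistant) -> float:
    """Return the fraction of evaluation prompts the assistant passes."""
    passed = 0
    for case in EVAL_SET:
        response = assistant(case["question"])
        has_fact = case["must_contain"].lower() in response.lower()
        has_citation = f'[{case["expected_source"]}]' in response
        passed += has_fact and has_citation
    return passed / len(EVAL_SET)

def fake_assistant(question: str) -> str:
    # Stand-in for a call to the deployed assistant endpoint.
    return "Employees accrue 20 days of annual leave per year [policy-leave.pdf]."

score = evaluate(fake_assistant)
print(f"evaluation pass rate: {score:.2f}")
assert score >= 0.9, "Regression: the assistant's answers have drifted."

Run on every data refresh or model update, a check like this turns “ongoing evaluation” into a concrete gate rather than a one-off test.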

Read more on KJR’s testing of the Microsoft platform here: https://kjr.com.au/kjr-chosen-to-test-microsoft-azures-public-sector-information-assistant/

The benefits of using your own data with an LLM

There are numerous benefits for businesses in having their own LLM (Large Language Model). It can bring:

1. Assistance for employees or customers: The LLM can assist various stakeholders in the company by offering the information and support they need. For example, the LLM can serve as a powerful tool for helping the sales team prepare bids and for the HR team and employees to look up policies. Companies can tailor the model to their specific domain, industry, or use case, ensuring that it understands the nuances of their business and can provide more accurate and relevant insights or services. 

2. Competitive Advantage: Having a proprietary LLM can provide a significant competitive advantage by offering unique capabilities or services that competitors don’t have access to. 

3. Innovation Accelerator: Building an LLM allows companies to innovate and develop new applications, products, or services powered by natural language understanding, potentially opening up new revenue streams or improving existing ones. 

4. Cost Efficiency: Running its own LLM can be more cost-effective in the long run than relying on third-party solutions, especially for companies with large-scale language processing needs.

And more! 

 

Companies can integrate their LLM directly into their existing systems, workflows, and applications, streamlining processes and improving efficiency. They can also maintain full control over the data used to train and ground the model, ensuring data privacy and security compliance.
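As an illustrative sketch only (the framework choice, endpoint name, and answer_question stub are assumptions, not KJR’s or Microsoft’s implementation), an LLM assistant grounded in company data might be exposed to existing systems and workflows through a small internal HTTP service:

# Minimal internal API wrapper around a company's own grounded LLM assistant.
from flask import Flask, jsonify, request

app = Flask(__name__)

def answer_question(question: str) -> str:
    """Placeholder for the organisation's own grounded LLM (e.g. a RAG pipeline)."""
    return f"Stub answer for: {question}"

@app.post("/ask")
def ask():
    payload = request.get_json(force=True)
    return jsonify({"answer": answer_question(payload.get("question", ""))})

if __name__ == "__main__":
    # Bind to localhost only; a real deployment would sit behind the company's
    # existing authentication, logging, and network controls.
    app.run(host="127.0.0.1", port=8080)

Because a service like this runs inside the company’s own environment, the data used to ground the model never has to leave its control, which is the privacy and compliance point above.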

Overall, building their own large language model gives companies greater flexibility, control, and potential for innovation in leveraging natural language processing technology to drive business success.

 

Contact us today to discuss how to implement your own Large Language Model and to access the URL for the GitHub repo of the Info Assistant: Contacts – KJR