How This Physician Trains His Own AI Assistants
By Alisa Pierce


Before putting his custom artificial intelligence (AI) assistants to work, Dallas pediatrician Joseph Schneider, MD, starts with a conversation, talking through the tasks he wants the technology to tackle, much as he would with a human assistant.

That habit of dialogue is what makes a custom AI assistant effective – and it’s one he wants more physicians to embrace.

That message was at the center of a CME-eligible session Dr. Schneider co-presented at TexMed April 18 with Dallas neonatal-perinatal medicine physician Jawahar Jagarapu, MD, exploring how physicians can safely use AI assistants.

For Dr. Schneider, AI's value in medicine hinges on how well physicians understand how to design, train, and use the technology, as well as the practical, ethical, and legal limitations of the AI assistant at hand.

As a past chair of the Texas Medical Association’s Committee on Health Information Technology and Augmented Intelligence, he said he is “very comfortable” exploring emerging systems – but he understands not every physician feels the same.

That’s why physicians “need more education on the subject,” said Dr. Schneider, an assistant professor of pediatrics and clinical informatics at UT Southwestern Medical School.

“If physicians understand how AI works, know its limitations, and recognize the possible practice efficiencies that will come from it, they’ll feel less hesitant about integrating it within their workflows,” he told Texas Medicine Today.

Unlike generic AI tools such as ChatGPT, which are built for general-purpose use, custom assistants can be tailored to a particular function in a physician’s practice, learning the relevant administrative and clinical policies. They can also be trained on a physician’s personal preferences, such as travel requirements. The better the training, the more accurate the AI assistant will be. But Dr. Schneider cautions that AI can still make things up, so physicians should always be careful.
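For physicians curious about what that tailoring looks like under the hood, the sketch below is purely illustrative: the custom assistants described in this article are configured through ChatGPT’s own setup screens, not code, and the practice policies, model name, and helper function shown here are hypothetical placeholders. It uses the OpenAI Python SDK only to show the same idea – a reusable set of instructions that carries a practice’s policies and a physician’s preferences into every conversation.

```python
# Illustrative sketch only. The article's custom assistants are built inside
# ChatGPT's interface; this shows the equivalent idea with the OpenAI Python SDK.
# The policy text, model name, and question are hypothetical placeholders.
from openai import OpenAI

client = OpenAI()  # reads the OPENAI_API_KEY environment variable

PRACTICE_INSTRUCTIONS = """
You are an administrative assistant for a small pediatric practice.
Follow these practice policies and personal preferences:
- Office hours are Monday through Friday, 8 a.m. to 5 p.m.
- Avoid scheduling travel that requires leaving before 7 a.m.
- Never include patient names or other protected health information in drafts.
If you are unsure of an answer, say so instead of guessing.
"""

def ask_assistant(question: str) -> str:
    """Send one question to the custom-instructed assistant and return its reply."""
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # placeholder model name
        messages=[
            {"role": "system", "content": PRACTICE_INSTRUCTIONS},
            {"role": "user", "content": question},
        ],
    )
    return response.choices[0].message.content

print(ask_assistant("Draft a one-page checklist for onboarding a new front-desk staffer."))
```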

A “starting point” for Dr. Schneider was creating a custom AI assistant to automate the process of evaluating multiple AI vendors, using TMA’s AI vendor evaluation tool as a basic framework. After a guided conversation to train the evaluator, the tool responds with questions about:

  • Practice size and specialty;
  • Clinical relevance and intended use;
  • Data handling and privacy;
  • Regulatory status and validation;
  • Implementation feasibility and integration; and
  • Transparency, explainability, and monitoring.

As Dr. Schneider responds, the tool generates an evaluation summary that highlights strengths, weaknesses, risks, and key follow-up questions for vendors. The assistant not only saves that information for later use but also gives recommendations on how to vet other AI platforms – and asks how Dr. Schneider would want to use the data going forward.
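The sketch below illustrates that interview-then-summarize pattern in code. It is not Dr. Schneider’s tool – his evaluator is a custom GPT configured inside ChatGPT – and the prompts and model name are assumptions; only the six question categories come from the list above.

```python
# Minimal sketch of the interview-then-summarize flow described in the article,
# using the OpenAI Python SDK. Prompts and model name are placeholder assumptions.
from openai import OpenAI

client = OpenAI()

CATEGORIES = [
    "Practice size and specialty",
    "Clinical relevance and intended use",
    "Data handling and privacy",
    "Regulatory status and validation",
    "Implementation feasibility and integration",
    "Transparency, explainability, and monitoring",
]

# Gather the physician's answers, one category at a time.
answers = {category: input(f"{category}: ") for category in CATEGORIES}
notes = "\n".join(f"{category}: {answer}" for category, answer in answers.items())

# Ask the model to turn the notes into an evaluation summary.
summary = client.chat.completions.create(
    model="gpt-4o-mini",  # placeholder model name
    messages=[
        {"role": "system", "content": (
            "You help a physician evaluate health care AI vendors. From the notes "
            "provided, write a short summary of strengths, weaknesses, risks, and "
            "key follow-up questions to ask the vendor. Do not invent facts."
        )},
        {"role": "user", "content": notes},
    ],
)
print(summary.choices[0].message.content)
```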

Even with AI’s potential, Dr. Schneider cautions physicians to maintain appropriate boundaries when using it. His AI evaluator tool was built using ChatGPT, for example, and is not HIPAA-compliant, so he never enters protected health information into it.

HIPAA requires physicians to take certain steps to protect the privacy and security of patients’ data, including when using AI. Additionally, new Texas laws require physicians to disclose AI use to patients when using it for diagnosis and treatment, or when using an AI tool that is “intended to interact” with a patient receiving a health care service or treatment.

“Play around with creating AI assistants and learn some of the basics,” Dr. Schneider said. “These assistants can help with staff onboarding, policy management, information reviews, patient handouts, and much more – but remember that you’re still responsible for understanding and reviewing how it works in your practice. Just like a human assistant, it’s not perfect.”

His advice to physicians: “Catch the AI wave when it's small, not when it's a tsunami.”

For more information, TMA offers its members free resources within its artificial and augmented intelligence resource hub.

