Part ChatGPT 4o
ChatGPT 4o API demo
from pydantic import BaseModel
from openai import OpenAI

client = OpenAI(api_key="Your-API-key")
model_use = "gpt-4o-2024-08-06"

# parse() does structured output, so it needs a schema to validate against
class CalendarEvent(BaseModel):
    name: str
    date: str
    participants: list[str]

completion = client.beta.chat.completions.parse(
    model=model_use,
    messages=[
        {"role": "system", "content": "Extract the event information."},
        {"role": "user", "content": "Alice and Bob are going to a science fair on Friday."},
    ],
    response_format=CalendarEvent,
)

event = completion.choices[0].message.parsed
Note: I tried the model "gpt-4o" but it failed; the dated snapshot "gpt-4o-2024-08-06" worked.
How to create a ChatGPT API key
- Log in to the OpenAI platform
- Use the search bar to search for "API keys"
- Create a new secret key (it is shown only once; copy it before closing the tab)
- Go to Billing and add some credit to the account
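Because the secret key is shown only once and should not be hardcoded in source files, a common pattern is to read it from an environment variable; `OPENAI_API_KEY` is also the variable the openai client checks by default. A minimal sketch:

```python
import os

def load_api_key(var: str = "OPENAI_API_KEY") -> str:
    """Read the secret key from an environment variable instead of hardcoding it."""
    key = os.environ.get(var)
    if not key:
        raise RuntimeError(f"Set {var} before running (keys are shown only once at creation)")
    return key
```

With the variable set, `OpenAI(api_key=load_api_key())` works without a key ever appearing in the notebook.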
Part UniTox ChatGPT
- Read from fda.gov
- Read the label of the drug we are interested in from a .csv file
- Read the .html file or the .pdf file on the page
- Have ChatGPT create a summary of the .html and .pdf files
- Use the summary generated by ChatGPT to let ChatGPT decide whether the drug is toxic or not, and how toxic it is
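The reading steps can be sketched with only the standard library; the `label_url` column name is a placeholder for whatever the actual .csv header is:

```python
import csv
from html.parser import HTMLParser

class TextExtractor(HTMLParser):
    """Collect the visible text chunks of a drug-label HTML page."""
    def __init__(self):
        super().__init__()
        self.chunks = []

    def handle_data(self, data):
        if data.strip():
            self.chunks.append(data.strip())

def html_to_text(html: str) -> str:
    # Strip tags, keep only the text that would be fed to ChatGPT
    parser = TextExtractor()
    parser.feed(html)
    return " ".join(parser.chunks)

def load_label_urls(csv_path: str, column: str = "label_url"):
    # `label_url` is a hypothetical column name; adjust to the real .csv header
    with open(csv_path, newline="") as f:
        return [row[column] for row in csv.DictReader(f)]
```

PDF labels need an extra extraction library (e.g. pypdf), which is omitted here.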
Initial Prompt: "Provide a summary of all the parts of the drug label that discuss cardiotoxicity risks and cardiotoxic reactions for this drug. In your summary of each sentence, clearly state whether the drug itself was associated with or caused the cardiotoxicity risk." -> Output1 (Summary)
Toxicity Score Prompt: Output1 + "Given the above information about a drug, answer 'was this drug associated with No Cardiotoxicity, Less Cardiotoxicity, or Most Cardiotoxicity?' Now, answer with just one word: No, Less or Most." -> Output2
Toxicity Test Prompt: Output1 + "Given the above information about a drug, answer 'was this drug associated with Cardiotoxicity?' Now, answer with just one word: Yes or No." -> Output3
Output2 and Output3 are compared against the ground truth (GT).
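The two-stage prompting above reduces to prompt construction plus strict one-word answer parsing; the call that actually sends the prompt to ChatGPT is omitted here, so this sketch only shows the surrounding plumbing:

```python
SCORE_PROMPT = (
    "Given the above information about a drug, answer 'was this drug associated "
    "with No Cardiotoxicity, Less Cardiotoxicity, or Most Cardiotoxicity?' "
    "Now, answer with just one word: No, Less or Most."
)

def build_score_prompt(summary: str) -> str:
    # Output1 (the ChatGPT summary) goes first, then the score question
    return f"{summary}\n\n{SCORE_PROMPT}"

def parse_score(answer: str) -> str:
    """Normalize the model's one-word reply; reject anything off-format."""
    word = answer.strip().rstrip(".").capitalize()
    if word not in {"No", "Less", "Most"}:
        raise ValueError(f"unexpected answer: {answer!r}")
    return word
```

Rejecting off-format replies (instead of guessing) makes failed prompts visible when comparing against the ground truth.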

Part Llama
Part Ollama
First start an ollama server on the server:
ml ollama      # load the ollama module ("ml" is the module-load shorthand on HPC systems)
ollama serve   # start the ollama server
To use ollama in Python (demo):
pip install ollama

from ollama import Client

client = Client(host="http://localhost:11434")  # default ollama port
model_use = "llama3.2"
completion = client.chat(
    model=model_use,
    messages=[
        {"role": "system", "content": "Extract the event information."},
        {"role": "user", "content": "Alice and Bob are going to a science fair on Friday."},
    ],
)
print(completion["message"]["content"])
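The Python client is a thin wrapper over the server's REST API (POST /api/chat). A stdlib-only sketch, with the request body built separately so it can be inspected without a running server:

```python
import json
import urllib.request

def build_chat_payload(messages, model="llama3.2"):
    # stream=False asks the server for one complete JSON reply
    return {"model": model, "messages": messages, "stream": False}

def ollama_chat(messages, model="llama3.2", host="http://localhost:11434"):
    """POST to the running `ollama serve` instance and return the reply text."""
    data = json.dumps(build_chat_payload(messages, model)).encode()
    req = urllib.request.Request(
        f"{host}/api/chat",
        data=data,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["message"]["content"]
```

This is handy when the `ollama` package is unavailable but the server is reachable.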