GPT vs. Llama

Like many in tech, I’ve been looking at various Large Language Models lately. One of the most fascinating use cases is having ChatGPT write code, which can be a real productivity enhancer for developers.

As part of my day job, I work with OSLC compliant applications. If you are not familiar with OSLC, check out this post on my ELM blog. OSLC is complex and powerful: the open nature of the standard means there are very few constraints, which allows customers and application vendors to create very powerful integrations and extensions.

Getting started writing OSLC applications requires a lot of boilerplate work, which should make it a perfect opportunity for using an LLM. As such, I decided to try two different tools. Instead of starting with ChatGPT, I thought I would use Meta’s LLaMA model, which we had talked about yesterday on my weekly podcast. If you haven’t listened to it yet, go check it out at GamesAtWork dot Biz.

After recording that episode, I used the instructions from one of our show links and set up llama.cpp on my Mac. Here’s how the conversation went:

DevPro llama.cpp-master % ./main -m ./models/30B/ggml-model-q4_0.bin -n 256 --repeat_penalty 1.0 --color -i -r "User:" -f prompts/chat-with-bob.txt

main: seed = 1679443809

<skipping a bunch of loading information>

main: prompt: ' Transcript of a dialog, where the User interacts with an Assistant named Bob. Bob is helpful, kind, honest, good at writing, and never fails to answer the User's requests immediately and with precision.

User: Hello, Bob.

Bob: Hello. How may I help you today?

User: Please tell me the largest city in Europe.

Bob: Sure. The largest city in Europe is Moscow, the capital of Russia.

User:'

<More log skipping >

== Running in interactive mode. ==

 – Press Ctrl+C to interject at any time.

 – Press Return to return control to LLaMa.

 – If you want to submit another line, end your input in '\'.

 Transcript of a dialog, where the User interacts with an Assistant named Bob. Bob is helpful, kind, honest, good at writing, and never fails to answer the User’s requests immediately and with precision.

User: Hello, Bob.

Bob: Hello. How may I help you today?

User: Please tell me the largest city in Europe.

Bob: Sure. The largest city in Europe is Moscow, the capital of Russia.

User:what is OSLC

Bob: OSLC is an open source collaborative project that enables sharing of information between systems and applications.

User:Write code to retrieve a requirement from a OSLC provider

Bob:OSLC is an open source collaborative project that enables sharing of information between systems and applications.

User:can you write a python program to retrieve a requirement from an OSLC Provider?

Bob:OSLC is an open source collaborative project that enables sharing of information between systems and applications.

User:Can you write any code to utilize OSLC?

Bob:OSLC is an open source collaborative project that enables sharing of information between systems and applications.

User:can you provide any more details

Bob:OSLC is an open source collaborative project that enables sharing of information between systems and applications.

conversation with llama

As you can see, Bob was not very helpful. To be honest, this is not a big surprise. I would assume that the Meta model is mainly trained on information found on Facebook or Instagram, which is not really a hotbed for development tool vendors or users.

Next I tried the same prompts with ChatGPT – let’s see what happened:

ME: what is OSLC

ChatGPT: OSLC stands for “Open Services for Lifecycle Collaboration”. It is a set of open standards that define how different software tools can interoperate and exchange data with each other, particularly in the context of product development and lifecycle management.

OSLC provides a standardized way for different tools to communicate with each other, reducing the need for custom integrations between individual tools. This helps organizations to better manage their product development process, improve collaboration between teams, and ultimately accelerate time-to-market.

OSLC defines a set of REST-based APIs (Application Programming Interfaces) that enable tools to exchange data with each other in a standardized way. This includes APIs for accessing and updating artifacts such as requirements, designs, test cases, and defects, as well as APIs for managing links between these artifacts.

OSLC is widely used in industries such as aerospace, automotive, and software development, where there is a need for complex tools to work together seamlessly.

ME: Write code to retrieve a requirement from a OSLC provider

ChatGPT: To retrieve a requirement from an OSLC provider, you would typically use the OSLC REST APIs to make a GET request to the provider’s endpoint for requirements. Here’s some sample code in Python that demonstrates how you might do this:

import requests

# Define the URL of the OSLC provider's requirements endpoint
requirement_url = 'https://example.com/oslc/requirements'

# Define the ID of the requirement you want to retrieve
requirement_id = '12345'

# Define the OSLC query string to retrieve the requirement with the specified ID
oslc_query = '?oslc.where=dcterms.identifier="{}"'.format(requirement_id)

# Define the headers to send with the request (including authentication if required)
headers = {
    'Accept': 'application/rdf+xml',
    'Authorization': 'Bearer <access_token>'  # Replace <access_token> with your actual access token if required
}

# Send the GET request to the OSLC provider's requirements endpoint with the specified query string and headers
response = requests.get(requirement_url + oslc_query, headers=headers)

# Check the response status code to see if the request was successful
if response.status_code == 200:
    # If the response was successful, extract the requirement data from the response body (which may be in RDF format)
    requirement_data = response.text
    print('Requirement data: {}'.format(requirement_data))
else:
    # If the response was not successful, raise an exception or handle the error in some other way
    raise Exception('Error retrieving requirement: status code {}'.format(response.status_code))

Note that the exact code required to retrieve a requirement from an OSLC provider will depend on the specific APIs and authentication mechanisms provided by the provider. The above code should serve as a starting point and may need to be modified to work with your particular provider.

conversation with public version of ChatGPT
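One thing ChatGPT’s sample glosses over is what to do with the response body: the provider returns RDF/XML, which you can pick apart with Python’s standard library. Here’s a minimal sketch of that step (the sample document, URL, and requirement title are illustrative placeholders, not output from a real provider):

```python
import xml.etree.ElementTree as ET

# An RDF/XML body of the kind an OSLC provider might return.
# The shape is illustrative; real providers include many more properties.
rdf_xml = """<?xml version="1.0"?>
<rdf:RDF xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#"
         xmlns:dcterms="http://purl.org/dc/terms/">
  <rdf:Description rdf:about="https://example.com/oslc/requirements/12345">
    <dcterms:identifier>12345</dcterms:identifier>
    <dcterms:title>The system shall log all login attempts</dcterms:title>
  </rdf:Description>
</rdf:RDF>"""

# Map the namespace prefixes used in the document so we can query by them
ns = {
    "rdf": "http://www.w3.org/1999/02/22-rdf-syntax-ns#",
    "dcterms": "http://purl.org/dc/terms/",
}

# Parse the body and pull the identifier and title out of each description
root = ET.fromstring(rdf_xml)
for desc in root.findall("rdf:Description", ns):
    identifier = desc.findtext("dcterms:identifier", namespaces=ns)
    title = desc.findtext("dcterms:title", namespaces=ns)
    print("Requirement {}: {}".format(identifier, title))
```

For anything beyond simple lookups, a dedicated RDF library such as rdflib is a better fit, but the standard library is enough to see what came back.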

Now that’s a very different response. ChatGPT not only understood the context of OSLC that I was looking for, it also provided a much more meaningful description of OSLC.

While the code won’t provide an out-of-the-box solution, as a Python developer it gives you enough of a framework to get started. I know that I will need some level of authentication, and that I need to accept the response back as an “application/rdf+xml” document. Knowing this, I thought, “hey,” let’s see what I would get back if I asked about the authentication code:

ME: can you provide sample authentication code for this Oslo code

ChatGPT: Sure, here’s an example of how you might authenticate with an OSLC provider using OAuth 2.0 and retrieve a requirement:

import requests
from requests_oauthlib import OAuth2Session

# Define the OAuth 2.0 credentials for the OSLC provider
client_id = 'my-client-id'
client_secret = 'my-client-secret'
authorization_url = 'https://example.com/oauth2/authorize'
token_url = 'https://example.com/oauth2/token'

# Define the URL of the OSLC provider's requirements endpoint
requirement_url = 'https://example.com/oslc/requirements'

# Define the ID of the requirement you want to retrieve
requirement_id = '12345'

# Define the OSLC query string to retrieve the requirement with the specified ID
oslc_query = '?oslc.where=dcterms.identifier="{}"'.format(requirement_id)

# Create an OAuth 2.0 session using the OAuth2Session class from the requests_oauthlib library
oauth_session = OAuth2Session(client_id, redirect_uri='http://localhost')
authorization_url, state = oauth_session.authorization_url(authorization_url)

# Send the user to the authorization URL and wait for them to grant authorization
print('Please go to this URL and authorize the application: {}'.format(authorization_url))
authorization_response = input('Enter the full callback URL after authorization: ')

# Fetch the access token using the token URL and the authorization response from the user
token = oauth_session.fetch_token(token_url, authorization_response=authorization_response, client_secret=client_secret)

# Define the headers to send with the request, including the access token
headers = {
    'Accept': 'application/rdf+xml',
    'Authorization': 'Bearer {}'.format(token['access_token'])
}

# Send the GET request to the OSLC provider's requirements endpoint with the specified query string and headers
response = requests.get(requirement_url + oslc_query, headers=headers)

# Check the response status code to see if the request was successful
if response.status_code == 200:
    # If the response was successful, extract the requirement data from the response body (which may be in RDF format)
    requirement_data = response.text
    print('Requirement data: {}'.format(requirement_data))
else:
    # If the response was not successful, raise an exception or handle the error in some other way
    raise Exception('Error retrieving requirement: status code {}'.format(response.status_code))

Note that the exact code required to authenticate with an OSLC provider will depend on the specific authentication mechanism used by the provider. The above code demonstrates how to authenticate using OAuth 2.0, but other mechanisms such as Basic Authentication or client certificates may also be used. Consult the provider’s documentation for details on how to authenticate with their APIs.

Question on authentication (ChatGPT)

As you can see, keeping the context allowed ChatGPT to pull in more details about the OAuth 2.0 approach for accessing the API. Having written my own example on my ELM blog, I can tell you this is a good start to help you understand OSLC.
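ChatGPT’s closing note about Basic Authentication is worth expanding, since many on-premise servers use it instead of OAuth. The header is simple to build by hand; here’s a sketch with placeholder credentials (only ever send these over HTTPS):

```python
import base64

# Basic Authentication sends "username:password" base64-encoded in the
# Authorization header. These credentials are placeholders.
username = "my-user"
password = "my-password"

# Encode the credential pair the way the Basic scheme requires
credentials = base64.b64encode(
    "{}:{}".format(username, password).encode("utf-8")
).decode("ascii")

# The same headers as before, with the Basic scheme swapped in for Bearer
headers = {
    "Accept": "application/rdf+xml",
    "Authorization": "Basic {}".format(credentials),
}

print(headers["Authorization"])
```

In practice you would not build this yourself: the requests library does it for you with `requests.get(url, auth=(username, password), headers=headers)`. The sketch just shows what ends up on the wire.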

It’s not 100% there, but it will certainly get you asking the right questions. I believe that is where AIs will start providing real value to development: not replacing developers, but helping new developers learn the basics, and letting experienced developers focus on the hard problems… for now.