AI Automators: Observation when using tokens in Advanced (Token)

Created on 14 April 2025

Problem/Motivation

We have been experimenting with great success but we have noticed an oddity with the use of a user-related token.

We have a pretty detailed LLM Prompt set up along the lines of...

...
You are tasked with marking a student's written answer to the following question…

[node:field_prompt]

You have access to a model answer…

[node:field_model]

You have access to the student's written answer…

[node:field_response]

Compare the student's response to the model answer and assess as follows:

The maximum marks for the whole question is 5, do not award any more than that.
...

But a line using the token [current-user:field_forename] either doesn't appear at all or, sometimes, we are left with just the original token text [current-user:field_forename] in place...

The particular line, which happens to be toward the end of the prompt, is...

If the maximum number of marks has been awarded then add a new paragraph saying "<strong>FULL MARKS</strong>, well done [current-user:field_forename], a great answer!"

It looks like any user-related tokens get ignored, so we are guessing this might be intentional, but there is nothing in the Watchdog log to say that we have violated any token requirements or anything like that.
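As an aside, the two behaviours we are seeing (token removed entirely vs. left as literal text) are consistent with how token replacement typically treats tokens it cannot resolve, depending on whether a "clear unreplaced tokens" option is in effect. The following is a minimal Python sketch of that idea only, not the actual Drupal token API; the function name and behaviour are illustrative assumptions:

```python
import re

def replace_tokens(text, data, clear=False):
    """Loosely mimic token replacement (NOT the real Drupal API):
    known tokens are substituted from `data`; unresolved tokens are
    left literal unless clear=True, in which case they are removed."""
    def sub(match):
        token = match.group(0)
        if token in data:
            return data[token]
        # Unresolved token: drop it if clearing, otherwise keep literal.
        return "" if clear else token
    return re.sub(r"\[[\w:-]+\]", sub, text)

prompt = "well done [current-user:field_forename], a great answer!"

# Anonymous user: no forename value is available.
print(replace_tokens(prompt, {}, clear=False))  # token left in place
print(replace_tokens(prompt, {}, clear=True))   # token removed entirely

# Logged-in user with the field populated.
print(replace_tokens(prompt, {"[current-user:field_forename]": "Alex"}))
```

This would explain why we sometimes see the raw token and sometimes see nothing at all: in both cases the token simply has no value at replacement time.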

💬 Support request
Status

Active

Version

1.0

Component

AI Automators

Created by

🇬🇧United Kingdom SirClickALot Somerset


Comments & Activities

  • Issue created by @SirClickALot
  • 🇬🇧United Kingdom MrDaleSmith

The current-user tokens should be automatically turned into their values from the logged-in user as part of the token generation process. The most obvious thing to check is that there is a currently logged-in user at the point that the AI request is being sent: if it's being triggered by the anonymous user (such as when you are running it via the queued process rather than directly), there will be no value. The other thing to check is that the user triggering the update has that value set on their account.

  • 🇬🇧United Kingdom SirClickALot Somerset

    Thanks @mrdalesmith, exactly what I thought, but they are not.

    If I understand correctly, they should be being interpreted, but they are definitely not.

    All the node entity-related tokens are being interpreted as you would expect.

    On the face of it, all of this testing is being done as UID 1 since I am logged in, so permissions ought not to be involved. However, to give more detail here, I will add that I am triggering the Automator through an ECA model using the feature provided by the ECA Integration module, so I suspect that might well be the issue.

    I can look into that further by using the ECA Switch User feature inside the ECA.

  • 🇬🇧United Kingdom MrDaleSmith

    Yeah, I'd recommend forcing ECA to use the root account as an initial test. I'm not 100% familiar with ECA, but I wouldn't be surprised if it runs its integrations as anonymous in most cases.

  • 🇮🇳India prashant.c Dharamshala

    @sirclickalot I tried user tokens with Advanced mode and those are working properly. It is possible that your token itself is returning empty/null values. However, I checked this on the 1.1.x version.
