Use appsembler/tahoe-lti with Tutor

Hi all,

I am running Open edX with Tutor and I would like to use the Tahoe Customizations for the LTI Consumer XBlock. I tried to install it as an XBlock as explained in the Tutor documentation:

echo "git+https://github.com/appsembler/tahoe-lti.git" >> "$(tutor config printroot)/env/build/openedx/requirements/private.txt"

On the GitHub page, it is mentioned that the following settings should be added to server-vars.yml “or whatever method you configure your Open edX installation”:

EDXAPP_XBLOCK_SETTINGS:
  lti_consumer:
    parameter_processors:
      - 'tahoe_lti.processors:basic_user_info'
      - 'tahoe_lti.processors:personal_user_info'
      - 'tahoe_lti.processors:cohort_info'
      - 'tahoe_lti.processors:team_info'

I added these settings to "$(tutor config printroot)/config.yml" but it does not work (I tested using the LTI Tool Provider emulator by checking “Message Parameters”). I am not sure this is the right place to add the settings, but I do not know where else to put them.

Any suggestions?

Thank you in advance for your help!


Hi @RonanFR! You’re right that adding these settings to the Tutor config.yml will not work for you. What you need to do is create a Tutor plugin (Plugins — Tutor documentation) that includes the following “openedx-common-settings” patch:

XBLOCK_SETTINGS["lti_consumer"] = {
    "parameter_processors": [
        "tahoe_lti.processors:basic_user_info",
        "tahoe_lti.processors:personal_user_info",
        "tahoe_lti.processors:cohort_info",
        "tahoe_lti.processors:team_info",
    ]
}
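Roughly, assuming a cookiecutter-generated plugin called “myplugin” (the name is just an example), you would then install and enable it with something like:

pip install -e ./tutor-myplugin
tutor plugins enable myplugin
tutor config save
tutor local restart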

Hi @regis and thanks a lot for your quick reply!
I followed the steps described in the documentation to create a Python plugin: I installed cookiecutter and generated the default empty Tutor plugin, then I added a file openedx-common-settings to the tutor-myplugin/tutormyplugin/patches/ directory, in which I copy-pasted the code that you gave in your message. But even after running tutor config save and restarting Tutor, these parameters do not seem to be taken into account according to the LTI Tool Provider emulator. I tried to replace XBLOCK_SETTINGS with EDXAPP_XBLOCK_SETTINGS but it did not work either. I am starting to wonder whether the Tahoe Customizations can be used as an XBlock at all, since they are an extension of the LTI Consumer XBlock (maybe that would explain why the plugin does not seem to take effect). Any advice on how I could check whether the Tahoe Customizations are indeed “installed”?

Also, something unexpected happened after I installed the cookiecutter tutor plugin: before installation the command tutor plugins list would output

discovery==11.0.1 (disabled)
ecommerce==11.0.0 (disabled)
license==11.0.0 (disabled)
minio==11.0.0 (disabled)
notes==11.0.0 (disabled)
xqueue==11.0.1 (disabled)
disableaccountcreation==0.1.0

while now it only outputs

myplugin==0.1.0
disableaccountcreation==0.1.0

Only a custom YAML plugin I made, disableaccountcreation, did not disappear from the list (the other plugins were installed with Tutor, if I recall correctly, but disabled by default).

First things first: before you start to wonder whether the add-on works, you need to make sure that it is correctly installed. I can help with the installation, but I have no idea whether the Tahoe customizations work, as I have not worked with them before.

This is almost certainly because you first installed the tutor binary, which ships with all free and officially supported plugins, but then pip-installed tutor on top. You need to figure out which one you are using (with which tutor, for instance). If you want to use custom Python plugins, then you want to use the pip-installed one.
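For instance, something like this (output and paths will vary from one machine to another):

which tutor
pip freeze | grep tutor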


Thanks a lot for your reply.

The command which tutor outputs /home/ubuntu/.local/bin/tutor, so it seems to be using the binary installation. pip freeze does show tutor-openedx==11.2.3, so tutor also seems to be installed with pip (I had forgotten I tried to install it with pip at some point, but I know I eventually used the binary installation). So I guess that would explain why I cannot make my plugin work (but what is weird is that the plugin does appear in tutor plugins list).

Would there be a way to create an equivalent YAML plugin instead of using python in this context?

Yes, I see no reason why creating a YAML plugin would not be possible instead. YAML plugins are supported by binary tutor distributions. Python plugins are best when you need to make changes to a large number of files, or need to define new templates.

I tried to create an equivalent YAML plugin as follows:

name: tahoelticonfig
version: 0.1.0
patches:
  openedx-common-settings: |
    "XBLOCK_SETTINGS":
      "lti_consumer":
        "parameter_processors":
          - 'tahoe_lti.processors:basic_user_info'
          - 'tahoe_lti.processors:personal_user_info'
          - 'tahoe_lti.processors:cohort_info'
          - 'tahoe_lti.processors:team_info'

But whenever I enable the plugin with tutor plugins enable tahoelticonfig, save the config with tutor config save, and then restart with tutor local restart, I obtain an “Internal Server Error” when trying to connect to the platform (both LMS and CMS). The error disappears whenever I disable the plugin and restart Tutor.

I am not sure that the formatting of the YAML file is correct (the examples provided on the Tutor website do not contain “nested” parameters). Otherwise, maybe the add-on is not correctly installed as you suggested @regis. How could I check whether the installation is OK? (I installed it by running the following command: echo "git+https://github.com/appsembler/tahoe-lti.git" >> "$(tutor config printroot)/env/build/openedx/requirements/private.txt")

After a more careful check, it seems that I indeed (unintentionally) switched from the binary installation to the pip installation at some point, as you suggested @regis (which explains why the list of plugins changed).

Just to be sure, I installed Tutor on a different machine using pip only (so that I am sure I can use Python plugins), installed the Tahoe LTI Customizations, and added a custom Python plugin with the “openedx-common-settings” patch:

XBLOCK_SETTINGS["lti_consumer"] = {
    "parameter_processors": [
        "tahoe_lti.processors:basic_user_info",
        "tahoe_lti.processors:personal_user_info",
        "tahoe_lti.processors:cohort_info",
        "tahoe_lti.processors:team_info",
    ]
}

And it does not seem to be taken into account by edX (no additional parameters are sent).

So overall, when I use a YAML plugin I obtain an Internal Server Error (probably because the formatting of the YAML file is wrong, but I do not really know how to correct it), and when I use a Python plugin I do not obtain any error, but the LTI Consumer XBlock just behaves as if no Tahoe LTI Customizations were present.

I investigated a little bit to try to identify where the problem was coming from:

  1. I executed the command pip3 freeze in both containers tutor_local_lms_1 and tutor_local_cms_1, and it seems the Tahoe LTI Customizations are correctly installed since it displays tahoe-lti==0.3.0 (equivalent host-side commands are sketched at the end of this post).

  2. Then I looked at the configuration files of the LMS and the CMS:

  • In container tutor_local_lms_1 the file /openedx/edx-platform/lms/devstack.yml contains a field XBLOCK_SETTINGS: {} but it is empty.
  • In container tutor_local_cms_1 the file /openedx/edx-platform/cms/devstack.yml contains a field XBLOCK_SETTINGS: {} but it is empty.

So based on these observations, I would say that the problem is coming from the Python plugin, which is not correctly modifying devstack.yml. @regis, can you confirm I am looking at the correct YAML file?
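(For reference, the same installation check can also be run from the host, assuming the default tutor local deployment, with something like:

tutor local run lms pip show tahoe-lti
tutor local run lms python -c "import tahoe_lti; print(tahoe_lti.__file__)"
)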

I manually changed the XBLOCK_SETTINGS in the devstack.yml files of containers tutor_local_lms_1 and tutor_local_cms_1 as follows:

XBLOCK_SETTINGS:
  lti_consumer:
    parameter_processors:
      - 'tahoe_lti.processors:basic_user_info'
      - 'tahoe_lti.processors:personal_user_info'
      - 'tahoe_lti.processors:cohort_info'
      - 'tahoe_lti.processors:team_info'

I then restarted Tutor but it didn’t seem to have any impact.

You should ignore the devstack.yml file, as it is unused in Tutor.

You should format the “openedx-common-settings” patch as valid Python code, as it will eventually be rendered into a Python settings file. Your patch here is actually YAML, which is almost certainly what is causing the internal server error.
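For instance, your YAML plugin could look like this (same skeleton, but with the patch body written as Python):

name: tahoelticonfig
version: 0.1.0
patches:
  openedx-common-settings: |
    XBLOCK_SETTINGS["lti_consumer"] = {
        "parameter_processors": [
            "tahoe_lti.processors:basic_user_info",
            "tahoe_lti.processors:personal_user_info",
            "tahoe_lti.processors:cohort_info",
            "tahoe_lti.processors:team_info",
        ]
    }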

On a side note, whenever you face an error, you should check the container logs: Troubleshooting — Tutor documentation


Thanks for the clarification on the devstack.yml file.
Ok, I see! Since it was a YAML file, I thought it would require YAML formatting. I tried with Python formatting and indeed there is no error.
Actually, I realised that both the YAML and Python plugins work correctly, since I noticed that the file /home/ubuntu/.local/share/tutor/env/apps/openedx/settings/lms/production.py was updated with the code

XBLOCK_SETTINGS["lti_consumer"] = {
    "parameter_processors": [
        "tahoe_lti.processors:basic_user_info",
        "tahoe_lti.processors:personal_user_info",
        "tahoe_lti.processors:cohort_info",
        "tahoe_lti.processors:team_info",
    ]
}

when either of the two plugins (Python or YAML) is enabled. I also noticed that the same piece of code is present in the files /openedx/edx-platform/lms/envs/tutor/production.py and /openedx/edx-platform/cms/envs/tutor/production.py of both containers tutor_local_lms_1 and tutor_local_cms_1 when either of the two plugins is enabled. So in the end I think both the XBlock installation of the Tahoe LTI Customizations and the plugin installation work (Python or YAML, same effect). But it does not seem to have any effect on the parameters that are sent by edX to the LTI tool (according to the LTI Tool Provider emulator), which is the purpose of the Tahoe Customizations (in particular sending the user data to obtain the name of the user). I do not know where the problem is coming from, but I am thinking that it has nothing to do with Tutor, so I will try to post the issue on the edX forum.

I discovered that the files openedx/edx-platform/lms/envs/common.py and openedx/edx-platform/cms/envs/common.py in both containers tutor_local_lms_1 and tutor_local_cms_1 had XBLOCK_SETTINGS = {} even after activating the plugin, but even after manually updating these files and restarting the platform, nothing changed. I am not sure in which file XBLOCK_SETTINGS is supposed to be set in the end.

The settings from common.py are overridden by tutor/production.py, so I wouldn’t worry about it. To verify this, you can simply open a shell:

tutor local run lms ./manage.py lms shell
# then, inside the Django shell:
from django.conf import settings
print(settings.XBLOCK_SETTINGS)

Thanks @regis. I tried the commands you gave and they seem to work fine; print(settings.XBLOCK_SETTINGS) displays

{'VideoBlock': {'licensing_enabled': False, 'YOUTUBE_API_KEY': 'PUT_YOUR_API_KEY_HERE'}, 'lti_consumer': {'parameter_processors': ['tahoe_lti.processors:basic_user_info', 'tahoe_lti.processors:personal_user_info', 'tahoe_lti.processors:cohort_info', 'tahoe_lti.processors:team_info']}}

I also checked the Tahoe LTI installation and everything seems fine:

from tahoe_lti.processors import personal_user_info, cohort_info, team_info, basic_user_info

So it seems unlikely that the problem is related to Tutor; I guess something is wrong with the XBlock.

I have tried to identify where the problem could be coming from.

I found that the parameter_processors are only used in the function get_parameter_processors of the file lti_consumer/lti_xblock.py in the edx/xblock-lti-consumer repository. The function get_parameter_processors checks the value of XBLOCK_SETTINGS["lti_consumer"]["parameter_processors"], which in my case is:

['tahoe_lti.processors:basic_user_info',
 'tahoe_lti.processors:personal_user_info',
 'tahoe_lti.processors:cohort_info',
 'tahoe_lti.processors:team_info']

The function then identifies the Python modules (the characters before the “:” symbol, e.g. tahoe_lti.processors) and the functions to import from these modules (the characters after the “:” symbol, e.g. basic_user_info). Here is the code (quite simple):

    def get_parameter_processors(self):
        """
        Read the parameter processor functions from the settings and return their functions.
        """
        if not self.enable_processors:
            return

        try:
            for path in self.get_settings().get('parameter_processors', []):
                module_path, func_name = path.split(':', 1)
                module = import_module(module_path)
                yield getattr(module, func_name)
        except Exception:
            log.exception('Something went wrong in reading the LTI XBlock configuration.')
            raise

All the functions of the tahoe_lti.processors module (such as basic_user_info, etc.) just return a dictionary mapping keys (strings) to values (strings). If my understanding is correct, the idea is that all these key-value pairs should be sent by edX to the LTI Tool. So get_parameter_processors only returns a generator of functions that return dictionaries mapping strings to other strings.
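In other words, as far as I understand, a processor is just a function like this made-up example (not one of the actual Tahoe processors), which would be referenced as 'my_module:my_custom_info' in parameter_processors:

def my_custom_info(xblock):
    # Made-up example: receives the XBlock instance and returns extra
    # key-value pairs to be added to the LTI launch parameters.
    return {
        'custom_department': 'engineering',
        'custom_team_name': 'demo-team',
    }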

On my Tutor installation, the file lti_xblock.py can be found at 8 different paths:

tutor_local_cms_1:/openedx/venv/lib64/python3.8/site-packages/lti_consumer/
tutor_local_lms_1:/openedx/venv/lib64/python3.8/site-packages/lti_consumer/
tutor_local_cms_1:/openedx/venv/lib/python3.8/site-packages/lti_consumer/
tutor_local_lms_1:/openedx/venv/lib/python3.8/site-packages/lti_consumer/
tutor_local_lms-worker_1:/openedx/venv/lib64/python3.8/site-packages/lti_consumer/
tutor_local_lms-worker_1:/openedx/venv/lib/python3.8/site-packages/lti_consumer/
tutor_local_cms-worker_1:/openedx/venv/lib64/python3.8/site-packages/lti_consumer/
tutor_local_cms-worker_1:/openedx/venv/lib/python3.8/site-packages/lti_consumer/

According to the diff command, all the files are exactly the same.
I then tried to modify get_parameter_processors in all 8 files as follows:

    def get_parameter_processors(self):
        """
        Read the parameter processor functions from the settings and return their functions.
        """
        def test(xblock):
            return {'ronan': 'blabla', 'test': 'ronan'}
        for i in range(1):
            yield test

With this new code I was expecting edX to send the key-value pairs ronan=blabla and test=ronan independently of XBLOCK_SETTINGS. But when I check the LTI Tool Provider Emulator, none of these parameters are sent (see picture below), even after stopping and restarting Tutor (my changes are still there after the restart though). I also tried with {'lis_person_full_name': 'ronan'} instead of {'ronan': 'blabla', 'test': 'ronan'} (just to be sure that unexpected keys are not filtered later in the code; 'lis_person_full_name' is part of the LTI nomenclature).

@regis, are there other files I should modify for my changes to be taken into account? If not, do you have any idea why my modifications seem to have no effect?

Your issue no longer seems related to Tutor, so at this stage I think your best bet is to get in touch with the original XBlock authors (aka: Appsembler).

All the modifications I described in my messages only involved the Python files of the LTI Consumer XBlock in the edX platform, not the additional Appsembler XBlock. I tried to check whether, by modifying the edX code of the LTI Consumer XBlock, I could observe some change in the behaviour of the LTI Tool Emulator, and I have failed (and it is not clear to me why). But you are right that it is not related to Tutor; apologies if it is polluting the forum. If I find a solution, I will post it here, if that’s OK, so that people facing the same issue can overcome it.


After further investigation, I managed to identify that the problem was coming from a bug in the LTI Consumer XBlock code: namely, the method get_parameter_processors, in charge of handling the additional LTI parameters, is never called anywhere in the code. I then discovered that a pull request was opened on GitHub 15 days ago with a fix for this problem: [SE-4235] Add missing parameter processors code by pkulkark · Pull Request #150 · edx/xblock-lti-consumer · GitHub
