I am currently working to overcome a limitation that Cohatoe has in its current state: what if your Haskell code consists of more than one module? In that case, you have multiple object files, and of course you must provide all of them to Cohatoe. But you can only declare one of them - so how will the others be available at runtime? (Needless to say, you certainly don't want to have to declare each of them separately, either ;-)
hs-plugins, which Cohatoe uses on the Haskell side to load the object code, is able to preload module dependencies automatically. Therefore, as long as the module dependencies (that is, their object files) are available to it on some search path, this should be fine. However, the path to the object files to be loaded is determined by Cohatoe only when the object code is registered. They are presumably (but not necessarily, see below) in some subfolder of the plugins/ folder in the Eclipse installation. So some more logic is needed in Cohatoe to determine these paths and hand them to hs-plugins.
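A minimal sketch of that extra logic, in plain Haskell (the function name and the example paths are made up for illustration, and nothing here is Cohatoe's actual code): the search path to hand to hs-plugins would simply be the set of folders that contain the declared object files.

```haskell
import Data.List (nub)
import System.FilePath (takeDirectory)

-- Hypothetical helper: given the object files declared in a plugin.xml
-- (already resolved to file system paths), compute the folders that
-- hs-plugins should search when preloading module dependencies.
searchPathsFor :: [FilePath] -> [FilePath]
searchPathsFor declaredObjects = nub (map takeDirectory declaredObjects)

main :: IO ()
main = mapM_ putStrLn (searchPathsFor
  [ "/eclipse/plugins/com.example_1.0.0/obj/Main.o"
  , "/eclipse/plugins/com.example_1.0.0/obj/Util.o" ])
```

Both declared files live in the same folder here, so the resulting search path contains that folder exactly once.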
That's not too difficult, but here are some complications:
- an Eclipse plugin may run out of a .jar archive
- you might want to have several different folders containing object code
- you might want to have a lot of object files that should be packaged into a single archive file

In the first case, the object files are packed into the archive and can't be automatically found by Cohatoe or hs-plugins. For the .o files that have been declared in the plugin.xml, and their corresponding .hi files, Cohatoe uses a trick before sending their path to hs-plugins: it tells the Eclipse runtime that it needs them as a 'local file'. The Eclipse runtime then checks whether the plugin is currently running out of a file system folder (which is the case both at debug time, when the plugin is physically a plugin project in the workspace, and for deployed plugins that were deployed with the unpack="true" flag in the feature.xml of the feature that provides the plugin). If so, it gives us the file system location. If not, the plugin runs out of a jar file, and the Eclipse runtime extracts it into a cache location on the file system and gives us the path to that location.
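For reference, this is roughly what the relevant fragment of such a feature.xml looks like (the feature and plugin ids here are placeholders, not real Cohatoe ids):

```xml
<feature id="com.example.myfeature" version="1.0.0">
   <!-- unpack="true" makes Eclipse install this plugin as a folder,
        so its object files are directly visible on the file system -->
   <plugin
         id="com.example.haskellfunctions"
         version="1.0.0"
         unpack="true"/>
</feature>
```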
Now if we make it a convention in Cohatoe that all object files in the same folder as a declared one are also automatically available to hs-plugins, it is not sufficient to just send the path of the folder containing the declared object files, because that folder is not in all cases the one hs-plugins actually uses. We have to make the 'as local' conversion for that folder too, and not just for the folder itself, but also for all of its contents.
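A sketch of that convention in plain Haskell (the helper name is made up, and in Cohatoe itself each of these paths would additionally have to go through the 'as local file' conversion described above): collect every .o file that sits next to the declared one.

```haskell
import Control.Monad (forM_)
import Data.List (isSuffixOf, sort)
import System.Directory (createDirectoryIfMissing, getDirectoryContents,
                         getTemporaryDirectory, removeDirectoryRecursive)
import System.FilePath ((</>), takeDirectory)

-- Hypothetical helper: all object files in the same folder as the
-- declared object file, so they can be offered to hs-plugins as well.
siblingObjectFiles :: FilePath -> IO [FilePath]
siblingObjectFiles declaredObject = do
  let dir = takeDirectory declaredObject
  entries <- getDirectoryContents dir
  return [ dir </> e | e <- sort entries, ".o" `isSuffixOf` e ]

main :: IO ()
main = do
  tmp <- getTemporaryDirectory
  let dir = tmp </> "cohatoe-demo"
  createDirectoryIfMissing True dir
  -- fake a folder with two object files and one interface file
  forM_ ["Main.o", "Util.o", "Util.hi"] $ \f -> writeFile (dir </> f) ""
  objs <- siblingObjectFiles (dir </> "Main.o")
  mapM_ putStrLn objs
  removeDirectoryRecursive dir
```

Only the two .o files are picked up; the .hi file is ignored by this step.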
The other two points listed above are actually new features that I could add while I'm at it anyway. hs-plugins also provides functionality for loading libraries of object code.
I think I will add, in one of the next versions of Cohatoe, a way to declare such packages, which can then be used as dependencies. I'd like to give that some thought, though, because it seems to me that just giving a file list falls somewhat short of addressing the problem. If you could say
<objectcodepackage path="..." />
that would solve the problem, but it would force you to provide the binaries for the package in every Eclipse plugin that uses them. I would be interested in having some mechanism to address this, so that you can contribute a package independently, and just refer to it. It would then look like this:
<objectcodepackage id="myLib" path="..." />
The advantage of this would of course be that any haskellFunction declaration could just declare the package as a dependency by its id:
<depends id="myLib" />
even if the package was actually provided by a different Eclipse plugin.