QSF XML file backend for gnucash-gnome2-dev branch
linux at codehelp.co.uk
Wed Jan 26 13:22:59 EST 2005
On Wednesday 26 January 2005 4:52 pm, Josh Sled wrote:
> I don't quite understand the "user edited maps". I can see user-edited
> data values, but the applicationA<->applicationB maps don't seem like
> "user-edited" content per-se...?
> But, yeah, you're basically trying to do a [meta-]application task as a
> library. I'd be very thoughtful about the interface that the library
> both presents-to and expects-of the application for the purpose of user-
> and map-file management ... or maybe better: break that piece of the
> puzzle off into a stand-alone "qsf-map-control-console" app.
Yes, a tool will be useful - that's the next stage. In practice, user editing is limited to the default values and, as you recognise, there's little to be gained from the user editing the map calculation itself.
> > QOF will do that - by putting the data in XML as a single lump of
> > objects, QOF can query the QofBook read from that XML and do all kinds of
> > SQL-like things.
> Sure. And you can do that with XSLT/XQuery if the XML data is in the
> application-domain, too ... and you can do RDQL/Sparql if the data is in
> RDF ... and straight SQL if it's in a relational DB.
Yes, having the option to separate the data from the application has a host of advantages.
> I really do understand what you're trying to do, and understand its
> value. I also think you're re-inventing a /very/ large wheel,
> _especially_ with respect to the mapping stuff. I mean, you've already
> had to create a new mini-programming language in the map definition
> format ... why not just use an existing high-level language?
I spent months last year pondering the same thing. My goal has always been
that connection between pilot-link and GnuCash. The library that makes that
connection must be small and with few dependencies to be accepted by
pilot-link. It needs to integrate easily into GnuCash to be accepted here. It
needs to separate the data from the procedures so that the data is obtainable
in as wide a range of contexts as possible, and it needs to allow that
interchange with as few programming changes as possible. I did look at a wide range of options
but nothing gave the results of using QOF - after all, what could provide
better access to GnuCash data than the framework used by the GnuCash engine?
No doubt someone else may have come to another conclusion, but there is no
loss from not using XML-RPC or Lua or OAF or bonobo or the others. In
particular, RPC failed because it could not solve the calculation from my
last message - summing the miles travelled on business for one or more
customers over a specific time period. A specific function would have been
needed for that specific task. Using a generic method makes all sorts of
calculations possible with no extra programming.
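The contrast can be sketched in a few lines (Python here for brevity; `MileageRecord` and the predicate-based `query` function are illustrative stand-ins, not part of QOF or pilot-link): an RPC interface needs one dedicated function per report, whereas a generic query engine composes the same calculation from reusable predicates at runtime.

```python
from dataclasses import dataclass
from datetime import date

@dataclass
class MileageRecord:  # illustrative stand-in for a queryable object
    customer: str
    business: bool
    miles: float
    when: date

def query(records, *predicates):
    """Generic query: keep every record matching all predicates."""
    return [r for r in records if all(p(r) for p in predicates)]

records = [
    MileageRecord("ACME", True, 120.0, date(2004, 11, 3)),
    MileageRecord("ACME", False, 15.0, date(2004, 11, 4)),
    MileageRecord("Widgets", True, 80.0, date(2005, 1, 10)),
]

# "Miles travelled on business for one or more customers over a specific
# time period" becomes three composable predicates rather than one
# special-purpose RPC function written for this exact report.
start, end = date(2004, 1, 1), date(2004, 12, 31)
hits = query(
    records,
    lambda r: r.business,
    lambda r: r.customer in {"ACME"},
    lambda r: start <= r.when <= end,
)
total = sum(r.miles for r in hits)
print(total)  # 120.0
```

A new report (say, total miles per customer regardless of date) needs only different predicates, not new plumbing - which is the property RPC lacked here.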
> Well, I feel about validation like I do about assertions: great during
> development and debugging, but bad during runtime. Perhaps we can
> arrange things so that if the compilation system has libxml2 >=2.6.0 and
> --enable-debug, then the validation is done. Otherwise not.
I disagree: I prefer full validation, because there's no reason to implement a
subset of validation (just checking the document root) when code already exists
to check the whole. After all, you just said to use existing methods rather than
implementing our own. Schema validation isn't our own; checking just the root
tag would be.
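To make the difference concrete, here is a minimal sketch (Python stdlib; the element names are illustrative, not the actual QSF schema) of why a root-tag check alone is weaker than full validation: a document with the right root but a garbage body still passes.

```python
import xml.etree.ElementTree as ET

GOOD = "<qof-qsf><book><object type='Customer'/></book></qof-qsf>"
BAD = "<qof-qsf><nonsense>garbage</nonsense></qof-qsf>"

def root_check(xml_text, expected_root="qof-qsf"):
    """The 'just check the doc root' approach: compare the root tag only."""
    return ET.fromstring(xml_text).tag == expected_root

# Both documents pass the root-only check...
print(root_check(GOOD), root_check(BAD))  # True True
# ...so only validating against the full schema (e.g. via libxml2)
# would reject BAD; the root check cannot tell them apart.
```

The root check identifies the file type but says nothing about whether the content conforms, which matters precisely because the files may be hand-edited.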
> That identification can very easily happen without the schema; it's just
> a string-comparison against the fully-qualified name of the root element
> of the document, as I said before.
> You only need to look at the fully-qualified name of the root element;
> you can safely assume the rest...
I just don't feel that is sufficient. The code is now operational, the
library version is satisfactory for the target, and the schema is itself
useful for future development, so I don't see any point in abandoning it
further down the road. If it wasn't working, I might agree with you.
> > code the majority of the checks that the schema would have done. These
> > QSF objects are user-edited - all manner of garbage could be in them.
> You expect these objects to be _user_ edited?
They can be, yes. They can also be created using XSLT or numerous other tools
to allow importing of formats that GnuCash does not currently support. As you
said, it's metadata.
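As a hedged sketch of that machine-generation route (element and attribute names here are illustrative, not the real QSF format), any tool that can emit XML can produce import data programmatically:

```python
import xml.etree.ElementTree as ET

# Build a QSF-style document in code (names are illustrative only);
# an XSLT transform from another application's export would produce
# the same kind of output.
root = ET.Element("qof-qsf")
book = ET.SubElement(root, "book")
obj = ET.SubElement(book, "object", type="Customer")
ET.SubElement(obj, "string", type="name").text = "ACME Ltd"

xml_text = ET.tostring(root, encoding="unicode")
print(xml_text)
```

Output generated this way is exactly as trustworthy as the tool that wrote it, which is another reason to validate on import rather than assume well-formed input.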
> I thought the whole point
> was machine-to-machine integration? Import/Export stuff...
We all know XML will be edited when the need arises; all we can do is discourage
people from tampering with the complicated bits, like the map calculation.
When problems arise, we've recommended tweaking GnuCash XML data files before
now. Not ideal or necessarily to be encouraged as routine, but sometimes
there is a place.
For example, if you want to send some data to a user to try to fix a problem,
you could send hand-edited QSF.
> In any case, I would argue that it is the responsibility of the exporter
> to generate correct XML.
True - I've tried to ensure that QSF is a model exporter.
> If the exporter is a human-being, then an
> editor which supports schema is useful, and publishing the schema in a
> form that's readable for editors to understand and use is a good thing.