[livesupport-dev] development environment, generated documentation
  • After getting access to the CVS server, I've committed what I have
    so far, which is:

    - a description of the development environment
    - the scheduler daemon implementation, with only one XML-RPC function
      so far


    For the documentation part, I have exported the current state, and put
    it here: http://tyrell.hu/~darkeye/livesupport/doc/ .

    The notable parts are:

    getting started:
    http://tyrell.hu/~darkeye/livesupport/doc/gettingStarted.html

    development environment, with file conventions:
    http://tyrell.hu/~darkeye/livesupport/doc/developmentEnvironment/

    generated doxygen documentation for the C++ classes developed so far:
    http://tyrell.hu/~darkeye/livesupport/doc/doxygen/html/

    generated page for the existing tests and their results:
    http://tyrell.hu/~darkeye/livesupport/doc/testResults.html



    Please take a look at the above pages, especially the file conventions -
    I've included generic conventions, and also conventions for the file
    types I use. It would be good to have conventions for other types of
    files, like PHP files, as well. Changes and remarks are of course
    welcome, especially for the HTML file conventions.


    As soon as our development server is ready, I'd like it to update a
    local copy of the CVS repository about daily, and generate
    documentation based on the current state. This would include:

    - simply checking out current versions of documentation pages
    - generating documentation from the source files, like doxygen
    - generating reports, like the test report above, but also others

    After generating all these documents from the ever-current version, it
    would upload them to some place where they can be accessed through the
    main livesupport page. Thus, these generated pages would become an
    integral part of the livesupport web site.

    Of course, this would all be automated (not hard to do).
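    As a rough illustration, the nightly job could be a small shell script
    run from cron. Everything specific here - the CVSROOT, the paths, and
    the make target - is an assumption, not our actual setup; by default
    the script only prints the steps it would run.

```shell
#!/bin/sh
# Sketch of the nightly documentation build described above.
# The CVSROOT, paths, and make target are assumptions, not the real setup.

nightly_build() {
    run="echo"   # dry run: print each step instead of executing it

    cvsroot=":pserver:anonymous@cvs.example.org:/cvsroot/livesupport"  # assumed
    workdir="${WORKDIR:-$(mktemp -d)}"
    docroot="${DOCROOT:-/var/www/livesupport/doc}"                     # assumed

    mkdir -p "$workdir" && cd "$workdir" || return 1

    # 1. check out the current version of everything, including doc pages
    $run cvs -d "$cvsroot" checkout -P livesupport

    # 2. generate documentation from the source files (doxygen)
    $run doxygen livesupport/etc/doxygen.config

    # 3. run the tests and generate the test report (hypothetical target)
    $run make -C livesupport check

    # 4. upload to where the main livesupport page can link to it
    $run rsync -a livesupport/doc/ "$docroot/"
}

nightly_build
```

    To actually run it, clear the `run` variable and point the other
    variables at the real repository and web root.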


    As for future types of reports, we would have:

    - code coverage reports, a metric of the ratio of the source
    code that is exercised by the tests; this is basically a
    quality measurement
    - metrics generated from the source code itself
    - results of other verification tools



    PS: Is there a perl guru among us? I've found a perl framework called
    lcov, which generates nice HTML reports from gcov output (gcov being
    the GNU gcc code coverage tool). Unfortunately, when I run it, I get a
    regexp error. I've filed a bug report about it here:
    http://sourceforge.net/tracker/index.php?func=detail&aid=997030&group_id=3382&atid=103382
    but so far there has been no reply :(

    ------------------------------------------
    Posted to Phorum via PhorumMail