sites back --Simon Michael, Fri, 02 Jan 2009 11:27:59 -0800 reply

On Jan 2, 2009, at 5:27 AM, Kent Tenney wrote:

> IRC tells me you are on vacation, hope you're chillin' hard.
>
> Hate to bring you problems, but I'm curious about the status of zwiki.org; is there an ETA for its return?

Thanks Kent. All sites should be back now. I fixed an outage over Christmas (Apache consuming all memory), but maybe forgot to test the Zope sites?! That's pretty poor sysadmin work on my part; my apologies, all.

Vacation was terrific. I'm just back from San Francisco, Ireland and a meditation retreat here in California. Happy new year to all of us.

Test topic --Adriano Santoni, Sun, 25 Jan 2009 23:10:54 -0800 reply

Very interesting...

Test topic --Adriano Santoni, Sun, 25 Jan 2009 23:11:44 -0800 reply

That's not true; in my opinion it's boring...

darcs send --vejeta, Wed, 28 Jan 2009 10:30:34 -0800 reply

Followed the steps in http://zwiki.org/DarcsRepos

I tried to do a darcs send:

darcs send
Creating patch to "http://zwiki.org/repos/ZWiki"...
Wed Jan 28 17:45:24 CET 2009  Myself <edited@edited.com>
  * rating_plone3_1_7.patch
  See open issue for a detailed report.
  http://zwiki.org/1425ErrorWhenRatingAWikiPageInPlone317
Shall I send this patch? (1/1)  [ynWsfvpxdaqjk], or ? for help: y
What is the target email address? zwiki@zwiki.org
Successfully sent patch bundle to: zwiki@zwiki.org.

Was this the right way? I didn't see it posted on the GeneralDiscussion page afterwards.

darcs send --Simon Michael, Wed, 28 Jan 2009 14:26:53 -0800 reply

I think you did everything right. I don't know why the patch didn't appear on GD, nor why darcs won't accept the patch (which you re-sent directly to me). I'll fix these things and report back. Thanks for the patch!

some testimonials for home page ? --Simon Michael, Wed, 28 Jan 2009 14:59:54 -0800 reply

Marketing... sure, why not. New Zwiki users are still showing up, and a little more marketing wouldn't hurt.

I think some of us enjoy using Zwiki and I've thought about collecting some testimonials/quotes. I know when I see a project that has realistic testimonials from happy users, that really helps motivate me to try it out.

I don't have much sense of how large/active/plugged in to this page the Zwiki-using community is, but if you are reading this, are happy with the software and can write a short testimonial, please send it. Don't wait, just hit reply now! :)

A testimonial is like a micro-review, typically a short quote. (Longer reviews are of course welcome too!) They should come from your real experience with Zwiki, and be honest (negative reviews are also useful). If I can collect enough positive ones, I'll start rotating them on the zwiki.org home page.

some testimonials for home page ? --Simon Michael, Wed, 28 Jan 2009 15:03:49 -0800 reply

On Jan 28, 2009, at 3:00 PM, Simon Michael wrote:

> A testimonial is like a micro-review, typically a short quote. (Longer

PS: Include the name you'd like to appear with the quote; the strongest testimonials include a real full name and organization.

some testimonials for home page ? --jmax, Wed, 28 Jan 2009 15:18:51 -0800 reply

We've been using ZWiki as our general purpose web platform for five years now; its brilliant blend of simplicity and malleability means we've been able to extend it in a whole bunch of directions: reference library, blogging platform, discussion forum, classroom support, writing environment, project tracker, and all-around web content management tool. We have redesigned our ZWiki-based site from top-to-bottom four times in five years, and I remain convinced that ZWiki is the most flexible and capable tool around. - John Maxwell, Canadian Centre for Studies in Publishing, SFU. http://thinkubator.ccsp.sfu.ca/AboutThinkubator

darcs send --Simon Michael, Wed, 28 Jan 2009 20:29:31 -0800 reply

Hi vejeta/zoperman, an update on this:

> I don't know why the patch didn't appear on GD

I haven't looked into this, but I know it wasn't working well (darcs patches made a mess of the wiki page) so I have changed the repo so darcs send forwards patches to me. I'll still use GeneralDiscussion for discussing patches. http://zwiki.org/DarcsRepos has been updated.

> nor why darcs won't accept the patch

This seems to be a darcs bug, reported as http://bugs.darcs.net/issue1335

I was able to work around the above, but the patch still does not apply in the stable repo; it says there are conflicts, and if I try darcs apply --mark-conflicts, darcs gives a different error. Looks like another darcs bug or something wrong with the repo. Could you try doing a darcs pull in your repo to make sure you have the latest, and then do "darcs send" again?

Next, a little patch review: I see you've added the uid keyword argument to the catalog_object call in rating.py. I expect this was needed for your version of Plone (what version do you have ?) Do you know if this will work with other Plone versions ? Zwiki 0.61 is supposed to work with all versions of Plone 3, and I think Plone 2.5. It would be good to hear the code was tested with these versions, or that you researched a bit and think the risk is low.
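For context, the call in question looks roughly like this - a sketch only, not the actual rating.py code (the 'rating' index name and the path-based uid are assumptions, though catalog_object(obj, uid=..., idxs=...) is the standard portal_catalog signature):

    from Products.CMFCore.utils import getToolByName

    def reindex_rating(page):
        # Reindex the page after a rating change, passing the catalog uid
        # explicitly (the change under discussion). The index name is a guess.
        catalog = getToolByName(page, 'portal_catalog', None)
        if catalog is not None:
            uid = '/'.join(page.getPhysicalPath())
            catalog.catalog_object(page, uid=uid, idxs=['rating'])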

This is a lot of work for a one-line fix, so if it's too much hassle don't worry about it. Your patch may also be highlighting some changes we need to make in the process, which is also valuable.

some testimonials for home page ? --betabug, Wed, 28 Jan 2009 22:00:32 -0800 reply

One of my cool experiences with Zwiki: I created a site for my father, and I almost never get support questions from him - he just updates that site all on his own. He's turning 65 now - not ancient, but hardly a web-hip youngster any more either.

darcs send --Juan M. Mendez, Thu, 29 Jan 2009 05:30:07 -0800 reply

2009/1/29 Simon Michael <zwiki@zwiki.org>:

> Hi vejeta/zoperman, an update on this:
>
>> I don't know why the patch didn't appear on GD
>
> Could you try doing a darcs pull in your repo to make sure you have the
> latest, and then do "darcs send" again?

I did

$ darcs pull
Pulling from "http://zwiki.org/repos/ZWiki"...
Thu Jan 29 04:08:55 CET 2009  Simon Michael <simon@joyful.com>
  tagged release-0-61-0
Shall I pull this patch? (1/1)  [ynWvpxdaqjk], or ? for help: y
Finished pulling and applying.

$ darcs send --to=simon@joyful.com
Creating patch to "http://zwiki.org/repos/ZWiki"...
Wed Jan 28 17:45:24 CET 2009  Juan M. Méndez <vejeta@gmail.com>
  * rating_plone3_1_7.patch
  See open issue for a detailed report.
  http://zwiki.org/1425ErrorWhenRatingAWikiPageInPlone317
Shall I send this patch? (1/1)  [ynWsfvpxdaqjk], or ? for help: y
Successfully sent patch bundle to: simon@joyful.com.

> Zwiki 0.61 is supposed to work with all versions of Plone 3, and I
> think Plone 2.5. It would be good to hear the code was tested with
> these versions, or that you researched a bit and think the risk is low.

I have tested it with Zope 2.10 and Plone 3.1.7. I've created an instance on my dev machine and downloaded Plone 2.5.3. As soon as I finish the setup, I'll report what I get.

> This is a lot of work for a one-line fix, so if it's too much hassle
> don't worry about it. Your patch may also be highlighting some changes
> we need to make in the process, which is also valuable.

I'm glad to collaborate; it's the least I can do in return for Zwiki, Zope and Plone.

darcs send --Simon Michael, Thu, 29 Jan 2009 10:04:46 -0800 reply

FYI, that one was better - no conflicts. I recorded the 0-61-0 tag yesterday since the old tag had a problem; I guess you needed that.

announcing new repos, Zwiki 1 and 2 --simon, Fri, 30 Jan 2009 01:33:38 -0800 reply

I have repurposed the Zwiki stable and unstable repos and updated FrontPage and CodeRepos. We now have a Zwiki 1 repo (the former stable branch) and a Zwiki 2 repo (the former unstable branch); see CodeRepos for the details.

I hope this will simplify things and help us create some more cool stuff!

drop some page types from Zwiki 2 ? --Simon Michael, Tue, 03 Feb 2009 16:29:15 -0800 reply

How about dropping the moin page type from Zwiki 2 ?

I ask because I've just partially fixed unicode-related bugs in the old moin code we ship with, but I know there are more in there.

The moin support was sponsored by Canonical in the days when they were switching from moin to zwiki, I think. However I don't think it's used by anyone nowadays. Also I don't think it helps people migrating between the two wiki engines all that much. It does allow you to merge moin content into a zwiki and convert the pages gradually by hand. But it doesn't help with the actual markup conversion. And our moin markup is probably not a desirable everyday markup, as it's not 100% like MoinMoin (no macros), it lags behind rst/stx in terms of Zwiki support, and it's less intuitive for users than those two imho. So I suggest we drop it from Zwiki 2 to free up time for other parts of the code.

Ditto for the wwml page type - I suggest we drop this one too, simply because no zwiki users use it.

Comments ? If you really need either of these page types, this is your chance to make your case.

drop some page types from Zwiki 2 ? --Simon Michael, Tue, 03 Feb 2009 16:40:00 -0800 reply

Continuing down this path, I guess I would propose to drop the LaTeX- and MathAction-supporting page types also, because they are not used enough to justify spending maintenance cycles on them, and LaTeX at least is better served by other wiki engines now.

Well, I guess there is one big user still, the Axiom project (http://axiom-wiki.newsynthesis.org ). But they can still run Zwiki 1, and if interested they can maintain MathAction as a contributed plugin again for Zwiki 2.

So where previously I bundled some plugins, to maintain them better and provide those features for all, I now seem to be advocating unbundling them, to focus core developers' efforts on things which serve the most users.

posted on slashdot --Simon Michael, Thu, 19 Feb 2009 15:49:40 -0800 reply

FYI, I've just posted to How Do You Document Technical Procedures ? (http://ask.slashdot.org/article.pl?sid=09/02/19/1631231 ) on Slashdot, pushing Zwiki and FreeHosting. Surprisingly few wiki advocates have appeared there, at least in the high-rated replies I read.

My comment is http://ask.slashdot.org/comments.pl?sid=1133925&cid=26923577 , feel free to add your thoughts or mod up!

frontpage --frank, Sat, 28 Feb 2009 13:53:48 -0800 reply

Hi Simon, you've been quite active lately... keep up the good work!

With all the changes I wanted to make one remark: FrontPage doesn't show correctly in any version of Internet Explorer I've tried - not IE6, IE7, or IE8 (with or without compatibility mode). I tried http://browsershots.org to find out the differences between browsers, but it chokes on zwiki.org's robots.txt.

frontpage --Simon Michael, Sun, 01 Mar 2009 09:58:31 -0800 reply

Thanks for the encouragement, Frank! Good to know the activity is noticeable.

> With all the changes I wanted to make one remark: FrontPage doesn't
> show correctly in any version of Internet Explorer I've tried - not
> IE6, IE7, or IE8 (with or without compatibility mode). I tried
> http://browsershots.org to find out the differences between browsers,
> but it chokes on zwiki.org's robots.txt.

Hmm, very good to be aware of. The list of fix and polish issues is growing long! All help welcome. browsershots.org looks really handy.

robots.txt was permitting only Google and the Internet Archive for a long time - a clumsy solution to load problems caused by rude bots. Microsoft has been periodically asking me to open up robots.txt for them. I've gone ahead and opened it up to all; we'll see what happens.
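For reference, the wide-open robots.txt is just:

    User-agent: *
    Disallow: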

Re: zope/plone article --Simon Michael, Sun, 01 Mar 2009 10:25:16 -0800 reply

Hi Niall.. thanks for the feedback, it is useful. I'll take this to zwiki.org if you don't mind.

> I'd personally absolutely love if the page URL showed the ZWiki page
> hierarchy. Much more SEO friendly in particular.

Ah, that's how most sites do it, but I really dislike it (overall). It becomes much less attractive to rearrange your hierarchy, because your links break. And it's much harder to remember and guess URLs for pages. Why are deep URLs more SEO friendly ?

> No way ZWiki could tag which bits are comments and automatically
> separate comments from content when asked?

It knows internally - try SomePage/documentPart and SomePage/discussionPart. What's the problem you're having ? If you'd like to have a skin or skin option that displays the comments part differently, e.g. makes them collapsible, I agree. They do have their own CSS class/id I believe. Note you can also disable commenting (deny the 'Zwiki: add comments' permission). Or with some skin tweaks you can enable commenting only on *Talk pages, and provide a standard link to those (mediawiki-style).

>>> in particular in my case my restructured
>>> text won't let me insert raw html for some odd reason.
>>
>> http://zwiki.org/1342 discusses this. Summary: get the latest Zwiki 2
>> from darcs and set the rst_raw_enabled folder property to true to get
>> it working.
>
> You solved my problem there - thank you. I had thought I was going
> mad.
>
> I'll tell you why I wanted the dtml ... I am trying to create a master
> include page which defines a long list of external links to sites
> all over
> the internet. I then want to include that master include page into
> each and
> every Zwiki page such that they can then refer to the links without
> having
> to define them in each and every page. My intent is that updating
> masterinclude therefore updates all Zwiki links at once.
>
> This is effectively http://zwiki.org/IncludeOrTransclude. I had
> originally
> hoped that restructured text's include:: directive would work but
> unfortunately it has been disabled. I then got the dtml working via
> <dtml-
> var "include('Masterinclude')">, but unfortunately this appears to
> run the
> REST processor per included file and therefore the links don't
> transfer.
>
> Is there any way for me to accomplish what I want?

For sure. You can call any of Zwiki's public methods, so e.g. <dtml-var "pageWithName('Masterinclude').read()"> would get just the text, without any formatting or wiki-linking. But maybe http://zwiki.org/RemoteWikiLinks will do what you want ?
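As a sketch, that call could also live in a small Script (Python) in the wiki folder, which a template or DTML page could then call (the script id and page name here are placeholders, not existing Zwiki objects):

    ## Script (Python) "masterinclude_text"
    # Return the raw, unrendered text of the Masterinclude page so another
    # page can splice it in before doing its own formatting.
    master = context.pageWithName('Masterinclude')
    if master is None:
        return ''
    return master.read()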

store revisions with each page ? --Simon Michael, Sun, 01 Mar 2009 12:17:44 -0800 reply

Willi Langenberger wrote:

> I'm thinking about versioning: making a ZWikiPage folderish and storing
> the versions in it. BTW, Silva does it that way and I like it ;-)

Anyone else got arguments for or against this ? Currently all revisions are stored in the revisions/ subfolder. Advantages of this: you can easily see how many have built up, and clean them out, in one place. Advantages of storing within the (folderish) pages: moving a page to another wiki would preserve its history, deleting a page would automatically delete the history, and maybe saving page renames in the history would be easier to implement...

ZWikiPage Folderish +1 --jmax, Sun, 01 Mar 2009 13:41:53 -0800 reply

YES PLEASE! This would make ZWiki hugely more valuable as a general purpose CMS -- imagine storing attachments, revisions, etc locally to a page object, rather than out in the open namespace. This is my #1 feature request for ZWiki. BTW, my #2 request, making comments more OO-like, could be solved by this same thing: make comments children of (or contained by) the page. Two birds, one stone.

ZWikiPage Folderish +1 --Simon Michael, Mon, 02 Mar 2009 08:12:39 -0800 reply

jmax - thanks, strong approval noted!

> CMS -- imagine storing attachments, revisions, etc locally to a page
> object, rather than out in the open namespace.

ZWikiPage Folderish +2 --EmmaLaurijssens, Mon, 02 Mar 2009 12:49:51 -0800 reply

But you already knew that, didn't you?

ZWikiPage Folderish +2 --Simon Michael, Mon, 02 Mar 2009 13:14:41 -0800 reply

On Mar 2, 2009, at 12:50 PM, EmmaLaurijssens wrote:

> +2 But you already knew that, didn't you?

No, I have forgotten who likes what. It's like starting over. :)

> * It would be ideal to create subpages, like the tabs you have on a
>   MediaWiki wiki.

Why ? (If I'm being dense, don't hesitate to spell it out as if for a child. :)

> * If you have lots of images referenced throughout the wiki you'd
>   still have the choice to put them in an images folder (or even an
>   images page, although that would take a loooong time to load) or
>   leave it the way it is now.

You're right that storing files per-page doesn't mean we can't still allow storing them in the folder. This is not an argument for doing it though.

> * Having special subpages/subfolders like images could be quite
>   powerful, especially if they can be inherited from somewhere.

Perhaps, perhaps, but we need an example or two otherwise I invoke http://c2.com/cgi/wiki?YouArentGonnaNeedIt .

For clarity let me just note this thread (about storing file attachments under pages, which I'm not convinced about yet) is orthogonal to the original post (about storing revisions under pages, which I am leaning towards though Willi hasn't argued it yet).

ZWikiPage Folderish +2 --jmax, Mon, 02 Mar 2009 14:27:19 -0800 reply

I would like to see revision tracking on attachments... not necessarily the revisions themselves, but a history of attachment changes. Does that help tie the two issues together?

Here's the scenario: Say a particular page talks about a particular image. It makes sense for the image to have a strong tie to the page (that is, not like a site-wide logo). Furthermore, if the image (which is a constituent piece of the page content) changes -- it gets updated or re-uploaded -- it would be great if the page were aware of that.

You asked:

> How often do you need to upload two attachments with the same name ?
> Conversely, how often might you want to link to the same attachment
> from two or more pages ?

Same image on two pages? Never, in my experience.

But the naming issue is to my mind part of the whole global-namespace issue (the wiki double-edged sword). Having attachments, sub-pages, etc. in a local context would be great. We often run lots of sub-projects within a wiki; namespace collisions do occur.

> Would you want those file/image links to break when the owner page is
> renamed or deleted ?

I think the advantages of locality outweigh this. I'm more interested in having the image local to the page, not just for referencing, but for tracking, etc.

> Wouldn't you want to be able to quickly see all files/images in your
> wiki by going to folder contents and sorting by type ? Or if you chose,
> organise them into logical folders ("images/", "css/") ? Or be able to
> upload your static website's file tree into zwiki and have it just work ?

Personally, I haven't wanted to do any of these things in 5 years of using ZWiki.

Your appeal to WikiWikiWeb:YouArentGonnaNeedIt is a good one. I'm always thinking of ways to stretch ZWiki into heavier CMS duties. Folderish objects are definitely on the right path there. But at the expense of simplicity.

ZWikiPage Folderish +2 --betabug, Tue, 03 Mar 2009 00:42:12 -0800 reply

jmax wrote:

> Same image on two pages? Never, in my experience.

Click on any image on Wikipedia, scroll down to the bottom of the page, and look at the list under "File links" (e.g. on http://en.wikipedia.org/wiki/File:Vithoba_Gutenberg.jpg - a random example from the Wikipedia start page). Even when we exclude all the links from user pages, there are quite often multiple uses of the same image.

ZWikiPage Folderish --betabug, Tue, 03 Mar 2009 00:54:17 -0800 reply

That "multiple uses of images" issue aside, I do think folderish ZWikiPages? would be useful. Same as Simon I had thought about storing revisions in there. It would probably make pruning of old revisions more complicated, but apart from that, revision handling would become cleaner and more logical.

Storing "page attachments" or images could be handled inside the folderish page or centrally - the downside is as already mentioned the added complexity.

I think storing sub-pages in there is only a good idea for strictly limited uses - I'm not even sure I'd want discussion pages handled that way. Having the page hierarchy reflected in folderish parent pages with "child pages stored inside" - I think that is an idea that clearly breaks the clean wiki approach.

Personally for image storing in my wikis, I use a simple image uploader / management object that I wrote myself. Those images are stored centrally, they can be commented on and they offer the ReST image links ready for copy and pasting. Hmm, I really should publish that code.

So, summarizing my opinion: I'm in favor of folderish ZWikiPages for revision storage, maybe for image storage in some cases, but clearly not for building page hierarchies.

Re: zope/plone article --Niall Douglas, Tue, 03 Mar 2009 01:44:58 -0800 reply

It would appear that it ignores replies unless subscribed ...

------- Forwarded message follows -------
From: Niall Douglas <s_sourceforge@nedprod.com>
To: zwiki@zwiki.org
Subject: Re: GeneralDiscussion Re: zope/plone article
Date sent: Mon, 02 Mar 2009 14:07:07 -0000

On 1 Mar 2009 at 10:25, Simon Michael wrote:

> Hi Niall.. thanks for the feedback, it is useful. I'll take this to
> zwiki.org if you don't mind.

If you could use the s_sourceforge@ email when on public discussions that would be great. The niall@ one is for non-public use only (to avoid spam).

>> I'd personally absolutely love if the page URL showed the ZWiki page
>> hierarchy. Much more SEO friendly in particular.
>
> Ah, that's how most sites do it, but I really dislike it (overall). It
> becomes much less attractive to rearrange your hierarchy, because your
> links break. And it's much harder to remember and guess URLs for pages.
> Why are deep URLs more SEO friendly ?

One of the big reasons that I like Plone is that it displays a reasonable full page URL rather than a "?Content=1964" or whatever. I know it's a matter of personal taste, but when you move content within a hierarchy you want the link to break to indicate the new hierarchy. That's what 301 response codes are for - to indicate permanently moved content.

My preference isn't so much of an issue on a Zwiki-only site but rather when running within Plone - one has a clearly visible Plone hierarchy in the navigation pane, and Plone follows a hierarchy, so one is used to it. There is a lot going on in the default Plone view and I find that the Zwiki hierarchy information gets "lost" within all the detail. Therefore I find myself glancing at the URL hoping for some clues, and not finding many.

>> I'll tell you why I wanted the dtml ... I am trying to create a master
>> include page which defines a long list of external links to sites all
>> over the internet. I then want to include that master include page into
>> each and every Zwiki page such that they can then refer to the links
>> without having to define them in each and every page. My intent is that
>> updating masterinclude therefore updates all Zwiki links at once.
>>
>> This is effectively http://zwiki.org/IncludeOrTransclude. I had
>> originally hoped that restructured text's include:: directive would work
>> but unfortunately it has been disabled. I then got the dtml working via
>> <dtml-var "include('Masterinclude')">, but unfortunately this appears to
>> run the REST processor per included file and therefore the links don't
>> transfer.
>>
>> Is there any way for me to accomplish what I want?
>
> For sure. You can call any of Zwiki's public methods, so e.g.
> <dtml-var "pageWithName('Masterinclude').read()"> would get just the text
> without any formatting or wiki-linking. But maybe
> http://zwiki.org/RemoteWikiLinks will do what you want ?

No, I find RemoteWikiLinks ugly because they require a prefix before every remote wiki link. I tried embedding a remote wiki link in an RST link but it isn't parsed.

I have been ploughing through the Zwiki source and page rendition certainly looks convoluted (mostly due to Plone, from what I gather). /portal_skins/zwiki/content.pt seems to invoke <div tal:replace="structure python:here.talsafe(options['body'])">, which as far as I can tell instructs Plone to go and fetch the body of the page via main_template.

This eventually ends back up in ZWikiPage.__call__() which calls self.preRender() which does:

    def preRender(self, page, text=None):
        t = text or (page.document() + '\n' + MIDSECTIONMARKER +
                     self.preRenderMessages(page))
        t = page.applyWikiLinkLineEscapesIn(t)
        t = self.format(t)
        t = page.markLinksIn(t, urls=0)
        t = self.protectEmailAddresses(page, t)
        return t

AFAICS, dtml is passed through untouched into the prerender output and is then executed separately, i.e. AFTER the PageTypeRst processing stage. Therefore, if I include one RST file into another via dtml, they are processed separately and therefore links in one cannot be used by the other.

Given this, the only way I can see around the problem is to have every page contain no content except for a dtml directive which glues together the RST text from two other separate pages and spits out its rendition. This obviously isn't ideal, yet I can't see much of an option without modifying the source code?

Is there any way to hook oneself into the prerender stage, e.g. overload the PageTypeRst.format() method?

One simple and easy solution is to hack in an RST option file_insertion_enabled=True somehow. I think though that RST inclusions would then use the filesystem rather than the ZODB, yes? That actually doesn't bother me - how would I insert this change from the ZMI/DTML rather than by modifying source?

Thanks in advance, Niall

------- End of forwarded message -------

Re: zope/plone article --Simon Michael, Tue, 03 Mar 2009 07:55:36 -0800 reply

Morning..

> It would appear that it ignores replies unless subscribed ...

Correct! Sorry, it should return a bounce in this case. That's been on the todo list forever. (On the other hand, zwiki.org now handles many incoming spams per minute, so maybe silently dropping unknown mailins is best in the long run, like qmail..)

>>> I'd personally absolutely love if the page URL showed the ZWiki page
>>> hierarchy. Much more SEO friendly in particular.

I still don't get the SEO argument, but I see you are finding Zwiki's page hierarchy mixed with Plone's to be awkward, and I quite agree with that. Two different models jammed together, and each one alone already has enough to deal with; it's bound to be somewhat confusing. I believe this is in the issue tracker; new ideas would be useful there. The simplest thing is to hide and not use Zwiki page hierarchy when in Plone, i.e. just be a classic flat wiki by default. A lot of Zwiki-in-Plone users do seem to use Zwiki page hierarchy though.

> I have been ploughing through the Zwiki source and page rendition
> certainly looks convoluted

You're right! Most folks don't trace through the whole process. The skins are a little contorted to achieve goals like: automatically using the Plone skin when in Plone; providing a simple consistent namespace for skin scripters; being able to customise anything with ZODB or filesystem or built-in templates; and being as "live" as possible, i.e. reacting to template changes immediately, at least in debug mode. These goals were reached, but the templates and styles need more cleanup and clarification. I'm working on that slowly.

> AFAICS, dtml is passed through untouched into the prerender output and
> is then executed separately, i.e. AFTER the PageTypeRst processing stage.

Correct. There are two rendering stages for a Zwiki page. Pre-rendering is done when saving an edit. We do as much of the work here as possible. Final rendering is done on each page view; the most dynamic things (resolving wiki links, evaluating DTML) are done here.

Sorry, I ran out of time here. I'll try to answer the rest later. I'm sure whatever it is can be done; it's just a matter of finding the right methods in ZWikiPage.py or rst.py.

Re: zope/plone article --Simon Michael, Tue, 03 Mar 2009 08:51:07 -0800 reply

Well, I'm not clear on what you're doing, but I'll make a guess: <dtml-var "wikilink(ThePage.read())"> ?

Re: zope/plone article --Niall Douglas, Thu, 05 Mar 2009 06:27:52 -0800 reply

On 3 Mar 2009 at 8:51, Simon Michael wrote:

> Well, I'm not clear on what you're doing, but I'll make a guess:
> <dtml-var "wikilink(ThePage.read())"> ?

Do you mean by this that I replace portal_skins/zwiki/content.pt like this:

<div metal:define-macro="content" class="content">
  <div tal:content="structure python:here.talsafe(
       here.wikilink(here.pageWithName('Masterinclude').read()+here.read()))">
    content goes here..
  </div>
  <!-- <div tal:replace="structure python:here.talsafe(options['body'])">
       main text, subtopics links, comments
       </div> -->
  <a name="bottom"></a>
  <br />
  <div metal:use-macro="here/macros/commentform" />
</div>

I'll try giving this a go when I next get access to my website for a while.

Niall

Re: zope/plone article --Niall Douglas, Thu, 05 Mar 2009 06:58:40 -0800 reply

This is actually working code for content.pt, but it seems awfully brute force (i.e. slow, CPU intensive and inefficient):

<div metal:define-macro="content" class="content">
  <div tal:replace="structure python:here.talsafe(here.renderText(
       here.pageWithName('Masterinclude').read()+'\n\n'+here.read(),
       here.pageTypeId(), REQUEST=request, RESPONSE=request.RESPONSE))">
    content goes here..
  </div>
  <!-- <div tal:replace="structure python:here.talsafe(options['body'])">
       main text, subtopics links, comments
       </div> -->
  <a name="bottom"></a>
  <br />
  <div metal:use-macro="here/macros/commentform" />
</div>

Surely there is a much better way of implementing this?

Cheers, Niall


Re: zope/plone article --Simon Michael, Thu, 05 Mar 2009 09:26:42 -0800 reply

Yow.. well if it works, great. :)

Can you describe briefly the end goal again ?

Re: zope/plone article --Niall Douglas, Fri, 06 Mar 2009 06:24:30 -0800 reply

On 5 Mar 2009 at 9:27, Simon Michael wrote:

> Yow.. well if it works, great. :)
>
> Can you describe briefly the end goal again ?

Yes. The masterinclude file defines a set of links in RST format. All other files in the wiki then use those links without having to define them individually. Now to update the links one simply changes masterinclude and the rest of the wiki updates itself.

I'm kinda surprised this isn't standard functionality in wikis but no matter.

I've been doing some performance and speed tests with my brute force master inclusion system which of course completely bypasses the zwiki prerendering system. I thought I should report my results ...

I am running Zwiki on Plone, but behind a CacheFu-controlled "varnish" reverse proxy - varnish is very seriously impressive, and as I mentioned in my "Setting up Plone on a low end VPS" guide, my piddling 256Mb RAM VPS can handle being slashdotted (>1000 concurrent requests) with varnish running, despite it using around 400Mb of RAM (192Mb of swap!)

To make use of varnish, you need a minimum of conditional HTTP working. I turned on conditional HTTP GETs - the Zwiki code reports the datestamp for the page which is to be expected, but this ignores the masterinclude. I can't see any way of overriding this date in the Zwiki source - would adding this facility be a good idea, since who knows what templates may do with rendering Zwiki pages? If they had a publicly available method to call it would be useful.

BTW, you say that conditional GET doesn't work with Firefox so you're getting stale content during edits - but you're not sending "Cache-Control: must-revalidate", which you must if you want Firefox to always ask whether cached items have been updated. Firefox is standards-correct here; IE and others are broken: if you return a Last-Modified, the browser is allowed to not request anything at all again until its own cache expires.

In fact, if you want to be really sure you have the conditional GET correct, this is the fairly standard form:

Cache-Control: max-age=86400, s-maxage=86400, public, must-revalidate, proxy-revalidate

This sets a maximum age of one day in the cache plus forces a 304 check for the item each and every time. This will solve your "edits in Firefox" problem and you can probably turn on conditional GET by default now.

It also solves my masterinclude problem because when I update masterinclude, the page cache will always be refreshed once a day.

Cheers, Niall

Re: zope/plone article --Simon Michael, Fri, 06 Mar 2009 08:07:23 -0800 reply

Morning Niall,

I'm glad you're digging into this. First, FYI: generally Zwiki's philosophy is to "just work" for the mainstream 80% of uses, and not to prevent further customisation, but not to fill up the codebase and development time with it either. Now -

The caching investigation is very much appreciated, I assume you're referring to some issue page or other. It's over my head just now and not causing me problems in daily zwiki usage so I'll have to say "patches welcome".

> Yes. The masterinclude file defines a set of links in RST format. All
> other files in the wiki then use those links without having to define
> them individually. Now to update the links one simply changes
> masterinclude and the rest of the wiki updates itself.

I see, thanks. I guess this might be important if your wiki has a lot of outgoing links to the same site(s). This is exactly what RemoteWikiLinks are for. I think you don't like those because the definitions are not gathered on a single page, and because you don't like the linking syntax (you must have come up with something equivalent though ?)

FWIW, in the past I found RemoteWikiLinks weren't worth the bother, because it was one more non-standard linking scheme for people to learn and it obscured where the link really went - ordinary bare urls or manual hyperlinks were clearer.

If you're into RST, maybe another option is to have the format method in rst.py append the contents of the masterinclude page before RST rendering. And on that page, define the link targets, RST footnote-style. This forces you to define each individual URL though, not just a base URL like RemoteWikiLinks. So probably not worth it.
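A very rough sketch of that idea, just to illustrate - the class name, the hook chosen and the 'Masterinclude' page name are all assumptions, not the real rst.py code:

    class PageTypeRstWithMaster(PageTypeRst):
        def preRender(self, page, text=None):
            # Append the wiki's shared RST link-target definitions to the page
            # source before normal pre-rendering, so docutils sees them together
            # with the page text. (Ignores the comments section for brevity.)
            t = text or page.document()
            master = page.pageWithName('Masterinclude')
            if master is not None:
                t = t + '\n\n' + master.read()
            return PageTypeRst.preRender(self, page, text=t)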

> I am running Zwiki on Plone, but behind a CacheFu-controlled "varnish"
> reverse proxy - varnish is very seriously impressive, and as I
> mentioned in my "Setting up Plone on a low end VPS" guide, my
> piddling 256Mb RAM VPS can handle being slashdotted (>1000 concurrent
> requests) with varnish running, despite it using around 400Mb of RAM
> (192Mb of swap!)

Yes it was good to see your detailed numbers in your write-up.

> To make use of varnish, you need a minimum of conditional HTTP working.
> I turned on conditional HTTP GETs - the Zwiki code reports the datestamp
> for the page which is to be expected, but this ignores the masterinclude.
> I can't see any way of overriding this date in the Zwiki source - would
> adding this facility be a good idea, since who knows what templates may
> do with rendering Zwiki pages? If they had a publicly available method
> to call it would be useful.

Yes that might be worthwhile. Though it's a bit of a small niche right now.

Re: zope/plone article --Niall Douglas, Fri, 06 Mar 2009 11:13:26 -0800 reply

On 6 Mar 2009 at 8:07, Simon Michael wrote:

> The caching investigation is very much appreciated, I assume you're
> referring to some issue page or other. It's over my head just now and
> not causing me problems in daily zwiki usage so I'll have to say
> "patches welcome".

Ah sorry - it's issue #1316 (http://zwiki.org/1316FireFoxDoesntLoadChangesAfterEditingWith304Enabled), which I found from http://zwiki.org/HowToEnableConditionalHTTPGET304.

I didn't post a patch because the fix is ridiculously easy: in ZWikiPage.py's handle_modified_headers, everywhere it does:

RESPONSE.setHeader('Last-Modified', rfc1123_date(last_mod))

... simply do:

    RESPONSE.setHeader('Last-Modified', rfc1123_date(last_mod))
    RESPONSE.setHeader('Cache-Control',
                       'max-age=86400, s-maxage=86400, public, must-revalidate, proxy-revalidate')

I'd do a find for 'Last-Modified' and add the Cache-Control as required.

> FWIW, in the past I found RemoteWikiLinks weren't worth the bother,
> because it was one more non-standard linking scheme for people to
> learn and it obscured where the link really went - ordinary bare urls
> or manual hyperlinks were clearer.

It's an issue of maintainability first, then appearance, yes.

> If you're into RST, maybe another option is to have the format method
> in rst.py append the contents of the masterinclude page before RST
> rendering. And on that page, define the link targets, RST footnote-style.
> This forces you to define each individual URL though, not just a base URL
> like RemoteWikiLinks. So probably not worth it.

It also means altering the source, which isn't available to me. Similarly, I'm stuck on Zwiki 0.60 because that's the latest in Ubuntu.

>> To make use of varnish, you need a minimum of conditional HTTP working.
>> I turned on conditional HTTP GETs - the Zwiki code reports the datestamp
>> for the page which is to be expected, but this ignores the masterinclude.
>> I can't see any way of overriding this date in the Zwiki source - would
>> adding this facility be a good idea, since who knows what templates may
>> do with rendering Zwiki pages? If they had a publicly available method
>> to call it would be useful.
>
> Yes that might be worthwhile. Though it's a bit of a small niche right
> now.

True.

I have another bug - this time in Preview mode where it isn't previewing correctly because the temporary page object isn't copying its attributes from its master page (e.g. "allow_dtml=1"). In common.py:renderText():

    # make a new page object, like in create
    p = page.__class__(__name__=page.getId())
    p.title = page.pageName()
    p = p.__of__(page.aq_parent)
    p.setPageType(self.id())
    p.setText(text)
    return p.render(

Here it sets up the temporary page object and it doesn't copy over attributes. Therefore, previewing a page with DTML enabled just for that page doesn't work.

'Fraid I have no idea how to copy across attributes in Zope. Something like setattr(p, 'allow_dtml', getattr(page, 'allow_dtml', 0)) is fairly obvious, but I couldn't figure out how to iterate over all the attributes.
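Something along these lines might be enough, assuming only a known list of per-page settings matters (the attribute list is illustrative, not the definitive set):

    from Acquisition import aq_base

    # Copy selected per-page settings from the real page onto the temporary
    # preview object; extend the tuple with whatever other attributes matter.
    for attr in ('allow_dtml',):
        if hasattr(aq_base(page), attr):
            setattr(p, attr, getattr(page, attr))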

Cheers, Niall

Re: zope/plone article --betabug, Fri, 06 Mar 2009 11:28:49 -0800 reply

> ... simply do:
>
>     RESPONSE.setHeader('Last-Modified', rfc1123_date(last_mod))
>     RESPONSE.setHeader('Cache-Control', 'max-age=86400, s-

Sorry, that is just not the proper way to do these things. For setting the Cache-Control header you use an "Accelerated HTTP Cache Manager" object. Hardcoding something like that in the code is not good style.

Re: zope/plone article --Simon Michael, Fri, 06 Mar 2009 11:55:56 -0800 reply

> I have another bug - this time in Preview mode where it isn't
> previewing correctly because the temporary page object isn't copying
> its attributes from its master page (e.g. "allow_dtml=1"). In
> common.py:renderText():
>
>     # make a new page object, like in create
>     p = page.__class__(__name__=page.getId())
>     p.title = page.pageName()
>     p = p.__of__(page.aq_parent)
>     p.setPageType(self.id())
>     p.setText(text)
>     return p.render(
>
> Here it sets up the temporary page object and it doesn't copy over
> attributes. Therefore, previewing a page with DTML enabled just for
> that page doesn't work.
>
> 'Fraid I have no idea how to copy across attributes in Zope. Something
> like setattr(p, 'allow_dtml', getattr(page, 'allow_dtml', 0)) is fairly
> obvious, but I couldn't figure out how to iterate over all the attributes.

True, thank you, forwarding to the issue tracker.

Re: zope/plone article --Simon Michael, Fri, 06 Mar 2009 12:32:38 -0800 reply

> For setting the Cache-Control header you use an "Accelerated HTTP
> Cache Manager" object. Hardcoding something like that in the code is
> not good style.

So if someone wants to make his zwiki-behind-varnish screamingly fast, what should he do ? Is it possible today without product code changes ?

Because, y'know, I'm starting to think it should be done for zwiki.org. If it doesn't screw anything up.

Re: zope/plone article --Niall Douglas, Fri, 06 Mar 2009 16:47:22 -0800 reply

On 6 Mar 2009 at 11:28, betabug wrote:

>> ... simply do:
>>
>>     RESPONSE.setHeader('Last-Modified', rfc1123_date(last_mod))
>>     RESPONSE.setHeader('Cache-Control', 'max-age=86400, s-
>
> Sorry, that is just not the proper way to do these things. For setting
> the Cache-Control header you use an "Accelerated HTTP Cache Manager"
> object. Hardcoding something like that in the code is not good style.

Umm, the Last-Modified (conditional GET) feature is ALREADY hardcoded. Adding Cache-Control is a bug fix, i.e. making it work the way Last-Modified was thought to work.

IMHO adding Accelerated HTTP Cache Manager support is a feature request and a totally separate issue.

Cheers, Niall

tweaking caching headers --simon, Fri, 06 Mar 2009 23:19:18 -0800 reply

I am tweaking the caching setup on zwiki.org, so if you notice any bugs (stale content when viewing or editing) please let me know. I installed Varnish for testing (not yet on port 80). I installed an Accelerated HTTP Cache Manager, but I don't think it's associated with anything (tried to search for cacheable content, gave up). Then I played around with the HTTP headers in ZWikiPage.py. No great speedup yet. I haven't gone and studied up on the latest caching techniques, but it looks like I'll need to. Unless... Patches Welcome!

tweaking caching headers --simon, Fri, 06 Mar 2009 23:23:08 -0800 reply

To be more specific: I also set conditional_http_get=True, condition_http_get_ignore=[] on the folder, and I changed the code to always set a Cache-Control header.
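Those folder properties can be set from a debug prompt roughly like this (the folder path is a placeholder; manage_addProperty is the standard PropertyManager call and fails if the property already exists):

    folder = app.zwiki   # placeholder: your wiki folder
    folder.manage_addProperty('conditional_http_get', True, 'boolean')
    folder.manage_addProperty('condition_http_get_ignore', [], 'lines')
    import transaction; transaction.commit()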

Re: zope/plone article --betabug, Sat, 07 Mar 2009 01:16:46 -0800 reply

> Umm, the Last-Modified (conditional get) feature is ALREADY hardcoded.

Sorry, no, it isn't hardcoded at all. First of all it's an optional setting. Second, even if activated, Last-Modified uses, well, the "last modified" time, which by definition isn't a hard-set value the way "max-age=86400" is. Someone might want to use another number instead of 86400, but "last time modified" isn't something that can be changed to another number.

> Adding Cache-Control is a bug fix, i.e. making it work the way Last-Modified was thought to work.

No, the two can be combined, but they do not have to be. Cache-Control is very much an admin-defined setting.

> IMHO adding Accelerated HTTP Cache Manager support is a feature request and a totally separate issue.

It's something that already works: set one up, associate wiki pages with it, done.

I think the confusion arises because of your suggestion to use must-revalidate in the Cache-Control header. That is a good suggestion, but first, it would have to be optional (maybe combined with the Last-Modified setting), and second, no way am I going to agree to a hardcoded value for max-age just because we want to set must-revalidate. Different sites may have vastly different notions of what is an acceptable value for max-age - if they want it set at all.

As I said before, the proper Zope way for the cache-control settings is to have these controls in an object where they can be set on a site-by-site basis. That's the way Plone does it (with their cache-fu tools), that's the way "Accelerated HTTP Cache Manager" is meant to be used.

tweaking caching headers --Niall Douglas, Sat, 07 Mar 2009 02:13:11 -0800 reply

On 6 Mar 2009 at 23:19, simon wrote:

> I am tweaking the caching setup on zwiki.org, so if you notice any bugs
> (stale content when viewing or editing) please let me know. I installed
> Varnish for testing (not yet on port 80). I installed an Accelerated HTTP
> Cache Manager, but I don't think it's associated with anything (tried to
> search for cacheable content, gave up). Then I played around with the
> HTTP headers in ZWikiPage.py. No great speedup yet. I haven't gone and
> studied up on the latest caching techniques, but it looks like I'll need
> to. Unless... Patches Welcome!
>
> To be more specific: I also set conditional_http_get=True,
> condition_http_get_ignore=[] on the folder, and I changed the code to
> always set a Cache-Control header.

Firstly, make sure you're using v2 of varnish rather than v1. v1 works, but it has idiosyncrasies ...

Secondly, you definitely want the "Live HTTP headers" plugin for Firefox. You also want to enable the telnet-based "control" port for varnish, which is very useful for finding out what's going on while it runs. I'd use Firefox to inspect the headers coming out of Zope directly (before going into varnish) as well as coming out of varnish.

Other than that, make sure Zope isn't adding a "Vary" header to Zwiki output that would prevent caching. Otherwise the Last-Modified should be enough. As I noted on my "low end Plone VPS" page, the speedup is extremely significant once it's working.
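If you'd rather script the header check than use a browser plugin, something quick along these lines works (Python 2; the URLs and ports are placeholders for hitting Zope directly and through varnish):

    import urllib2

    # Compare the caching-related headers Zope emits with what varnish serves.
    for url in ('http://localhost:8080/wiki/FrontPage',   # straight to Zope
                'http://localhost:6081/wiki/FrontPage'):  # through varnish
        resp = urllib2.urlopen(url)
        print url
        for h in ('Last-Modified', 'Cache-Control', 'Vary', 'Age'):
            print '  %s: %s' % (h, resp.headers.get(h))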

Oh and make sure Advanced HTTP Cache Manager isn't sending purges to the control interface for every page. You can check that using the control telnet interface.

Good luck!

Niall

Re: zope/plone article --Niall Douglas, Sat, 07 Mar 2009 02:14:12 -0800 reply

On 7 Mar 2009 at 1:17, betabug wrote:

> I think the confusion arises because of your suggestion to use
> must-revalidate in the Cache-Control header. That is a good suggestion,
> but first, it would have to be optional (maybe combined with the
> Last-Modified setting), and second, no way am I going to agree to a
> hardcoded value for max-age just because we want to set must-revalidate.
> Different sites may have vastly different notions of what is an
> acceptable value for max-age - if they want it set at all.

That's definitely where the confusion was arising - I had taken it from your original email that you didn't want conditional GET fixed because you felt that adding Accelerated HTTP Cache Manager support was a superior solution.

Now I have no idea whether adding support for it or not is hard. I was merely concerned with conditional GET working as intended.

> As I said before, the proper Zope way for the cache-control settings is
> to have these controls in an object where they can be set on a
> site-by-site basis. That's the way Plone does it (with their cache-fu
> tools), that's the way "Accelerated HTTP Cache Manager" is meant to be
> used.

I do absolutely agree that the CacheFu approach is superior. It appears to use per-user unique cookie values in the HTTP headers to operate a per-user cache which persists across days (or even weeks) - very clever. It even seems to know when a page has been updated and which bits to expunge from the cache.

I don't even pretend to know how it works - I turned it on and watched the HTTP headers, and it's outstanding. TBH, before CacheFu I wasn't even aware HTTP/1.1 had such abilities.

Cheers, Niall

Plone with membrane --Michael Ang, Thu, 19 Mar 2009 00:38:17 -0700 reply

ZWiki does not support Products.membrane and Products.remember; adding a subscription doesn't work. Looks like I have to get my hands dirty.

Plone with membrane --Simon Michael, Thu, 19 Mar 2009 07:44:57 -0700 reply

Welcome, Michael. It sounds like you know more about this and are going to work on it - please keep us posted.

still here --Simon Michael, Wed, 01 Apr 2009 14:23:44 -0700 reply

Hey all.. I've been quiet for a while; this is because of looming tax deadlines. Also, I would have sent some kind of April Fool's joke, but there's just too much to do - in particular I'm excited about the Zope 4 announcement today and have been digging around to see how quickly Zwiki can be brought up to speed for deployment on that platform. Exciting times!

Advice Needed: Migrating from ZWiki to MediaWiki? --morrisryanc, Thu, 06 Aug 2009 09:06:02 -0700 reply

Has anyone successfully transferred all of their content from ZWiki to MediaWiki? We're currently trying to do that here at my work.

Re: --Dirk WESSEL, Thu, 06 Aug 2009 09:08:18 -0700 reply

Sorry, but I'm not in the office. I'll be back on 15.08.2009.

In case you need CMB support please address your message to cm@nacma.nato.int.

Thanks Dirk Wessel

How to unsubscribe? --Patricia Goldweic, Mon, 31 Aug 2009 11:24:07 -0700 reply

Could somebody please remind me how to unsubscribe from this mailing list? Thanks in advance, -Patricia

pgoldweic@northwestern.edu

How to unsubscribe? --Simon Michael, Mon, 31 Aug 2009 11:56:40 -0700 reply

Hi Patricia. Generally: click the page link at the bottom, then click "subscribe" at the top right, then click one or both of the unsubscribe buttons you may see. I'll unsubscribe you after sending this, so this should be the last message you see.

Zwiki 2.0b1 released --Simon Michael, Fri, 23 Oct 2009 17:49:47 -0700 reply

I'm pleased to announce the first beta of Zwiki 2, formerly the unstable/unicode/skingeddon branch. This makes that work available in a release for the first time, and clarifies the status of Zwiki 2 as "the trunk".

In addition, this version supports Zope 2.12, Python 2.6 and setuptools. It should be possible to "easy_install Zwiki" and get Zwiki and Zope 2.12 installed; I'm especially interested in having people try this so the kinks can be worked out. The traditional tarball and darcs methods are also available.

For more details, see

Thanks to Vladimír Linek for his 2.12 patch which got the ball rolling. Zope 2.12 is quite an exciting release in my humble opinion.

I welcome your testing, patches and collaboration! Best to all, -Simon

Zwiki 2.0b1 released --jmax, Sat, 24 Oct 2009 01:17:39 -0700 reply

Big congrats on this release, Simon. I'll be watching carefully, plotting a course for upgrades here. Thanks!

Puzzle over missing "edit" --Cameron, Wed, 11 Nov 2009 13:43:49 -0800 reply

I want help reconfiguring a Zwiki. I'll describe what I have, and what I want: I came to an existing Zope site. It appears to be 2.9.7-final ... Zwiki was already installed. I created a new Zwiki instance. Things look normal. I begin to create content--Zwiki pages. In the upper right, each such page has selections: "home changes contents help options".

THERE'S NO "... edit"!

What am I missing? How do I restore the missing "edit" selection?

Puzzle over missing "edit" --Simon Michael, Wed, 11 Nov 2009 21:34:55 -0800 reply

Hi. Visit .../ThePage/editform . It may ask you to log in or provide more authentication. After you do that, the edit link should remain visible. If not, see if issue #1455 describes the problem.

better Zwiki Plone support, you can help --Simon Michael, Thu, 17 Dec 2009 19:00:08 -0800 reply

Hi all. I have set up a Pledgie fundraising goal for improving Zwiki's support for current Plone versions, and blogged about it at http://zwiki.org/FundraisingForPloneSupport . If this is something you need, funding (or patches) will make it happen. Thanks!

-Simon