Age | Commit message | Author |
|
* Rename --getctime to --gettime. (The old name still works for
backwards compatibility.)
* --gettime now also looks up last modification time.
* Add rcs_getmtime to plugin API; currently only implemented
for git.
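
A minimal sketch of what a git-backed rcs_getmtime could look like (the
git invocation is an illustrative assumption, not necessarily what the
plugin does):

    #!/usr/bin/perl
    use strict;
    use warnings;

    # Sketch: ask git for the committer timestamp of the newest commit
    # touching $file. Assumes the cwd is inside the repository.
    sub rcs_getmtime {
        my $file = shift;
        open(my $out, "-|", "git", "log", "--pretty=format:%ct", "-1", "--", $file)
            or die "git log: $!";
        my $time = <$out>;
        close $out;
        chomp $time if defined $time;
        return defined $time ? $time : 0;
    }

    print rcs_getmtime("index.mdwn"), "\n";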
|
|
|
|
|
|
The reason to do this is basically a user interaction design decision.
It is achieved by adding an entry, associated with the creating plugin, to
%pagestate. To find out if files were deleted, a new global hash %del_hash
is introduced.
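
Roughly, that bookkeeping looks like this (a toy sketch; the exact keys
and the %del_hash layout are assumptions based on the description above):

    #!/usr/bin/perl
    use strict;
    use warnings;

    our %pagestate;   # per-page, per-plugin state
    our %del_hash;    # assumed: maps deleted source files to a true value

    # Remember that "tags/unix" was auto-created by the tag plugin, so
    # the decision survives across refreshes via %pagestate.
    $pagestate{"tags/unix"}{tag}{autofile} = 1;

    # Don't recreate a file the user has deliberately deleted.
    my $srcfile = "tags/unix.mdwn";
    $del_hash{$srcfile} = 1;   # pretend the user deleted it
    print "skipping deleted $srcfile\n" if $del_hash{$srcfile};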
|
|
This reverts commit 31680111f0062f07727d14fcf291c98978ad5a2f.
|
|
pagespec_translate may set $@ if it fails to parse a pagespec, but
due to memoization, this is not reliable: if a memoized call is repeated
and $@ was already set for some other reason, it will remain set through
the call to pagespec_translate.
Instead, just check if pagespec_translate returns undef.
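
The resulting pattern, sketched (assumes ikiwiki's Perl library is
installed; the calling convention of the translated sub is simplified):

    #!/usr/bin/perl
    use strict;
    use warnings;
    use IkiWiki;

    sub safe_match {
        my ($page, $spec) = @_;
        my $sub = IkiWiki::pagespec_translate($spec);
        # undef means the pagespec failed to parse; don't consult $@,
        # which memoization may have left stale.
        return 0 unless defined $sub;
        return $sub->($page);
    }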
|
|
destdir, as well as wrappers and the .ikiwiki directory.
|
|
Use `_` to avoid superfluous stat.
Check for `defined $file`, instead of just `$file`.
Add spaces after commas.
Change `verify_src_file()` so it no longer returns the tainted filename.
Rename `$f` to `$file_untainted` in `verify_src_file()`.
Rename `$f` to `$file` in `find_src_files()`.
This attempts to fix commit f3abeac919c4736429bd3362af6edf51ede8e7fe.
For discussion see
<http://ikiwiki.info/todo/auto-create_tag_pages_according_to_a_template/>
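
Two of those review points in miniature (a standalone sketch, not the
ikiwiki code itself):

    #!/usr/bin/perl
    use strict;
    use warnings;

    my $file = shift @ARGV;

    # `defined $file`, not just `$file`: the filename "0" is false in
    # Perl, but is still a defined, legitimate name.
    die "usage: $0 file\n" unless defined $file;

    # The special `_` filehandle reuses the stat buffer of the previous
    # file test, so the second check costs no extra stat() call.
    if (-e $file && -f _) {
        print "$file is a plain file\n";
    }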
|
|
This also means that if autoadded files are deleted they will just be
recreated.
|
|
To make automatically added files render, they have to be added to the
$files, $pages, $new, and $changed variables.
After that, scan() is called on them.
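
In toy form (with stand-ins for ikiwiki's real scan() and refresh
bookkeeping), that wiring looks like:

    #!/usr/bin/perl
    use strict;
    use warnings;

    my (@files, @new, @changed, %pages);

    sub pagename { (my $p = shift) =~ s/\.mdwn$//; return $p }
    sub scan     { print "scanning $_[0]\n" }   # stand-in for IkiWiki's scan()

    # Each auto-added file joins the same lists the refresh pass walks,
    # then gets scanned so its links and metadata are known.
    foreach my $file ("tags/unix.mdwn") {
        push @files,   $file;
        push @new,     $file;
        push @changed, $file;
        $pages{pagename($file)} = 1;
        scan($file);
    }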
|
|
This also has the advantage that I can use the resulting new function
elsewhere.
|
|
|
|
discussionpage setting.
Specifically, fixes discussion actions on discussion pages, and unbreaks the opendiscussion plugin.
|
|
|
|
bestlink was looking at whether %links existed for a page in order to tell
if the page exists, but just-deleted pages still have entries in there (for
reasons it may be best not to explore). So bestlink would return
just-deleted pages. Instead, make bestlink use %pagesources.
Also, when finding a deleted page, %pagecase was not cleared of that page.
This, again, made bestlink return just-deleted pages. Now that is cleared.
Fixing bestlink exposed another issue though. The backlink calculation code
uses bestlink. So when a page was deleted, no backlinks to it are found,
and pages that really did backlink to it were not updated, and had broken
links.
To fix that, the code that actually removes deleted pages had to be split
out from find_del_files, so it can run a bit later. It is run just after
backlinks are calculated. This way, backlink calculation still sees the
deleted pages, but everything afterwards does not.
However, it does not address the original bug report that started this
whole thing, [[bugs/bestlink_returns_deleted_pages]]. Because there
bestlink is run in the needsbuild hook. And that happens before backlink
calculation, and so bestlink still returns deleted pages then. Also in the
scan hook.
If bestlink needs to work consistently during those hooks, a more involved
fix will be needed.
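
The core of the fix in toy form: test page existence against
%pagesources, which tracks actual source files, rather than %links:

    #!/usr/bin/perl
    use strict;
    use warnings;

    # %links can keep stale entries for just-deleted pages, but a page
    # only has a %pagesources entry while its source file exists.
    my %links       = (gone  => ["index"]);   # stale, page was deleted
    my %pagesources = (index => "index.mdwn");

    foreach my $page ("index", "gone") {
        my $exists = exists $pagesources{$page} ? "yes" : "no";
        print "$page exists: $exists\n";
    }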
|
|
template is filled out. This improves the search plugin's indexing, since it will not include navigational elements from the page template or sidebar.
|
|
files if a sourcedir of "./" was specified.
|
|
Conflicts:
IkiWiki.pm
IkiWiki/Render.pm
debian/changelog
|
|
Benchmarking refresh of a wiki with 25 thousand pages showed
file_pruned() using most of the time. But, when refreshing, ikiwiki already
knows about nearly all the files. So we can skip calling file_pruned() for
those it knows about. While tricky to do, this sped up a refresh (that
otherwise does no work) by 10-50%.
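
The shape of that optimisation, sketched (%known stands in for
ikiwiki's record of files from the previous run):

    #!/usr/bin/perl
    use strict;
    use warnings;

    sub file_pruned {
        # stand-in for the real, comparatively expensive regexp test
        my $file = shift;
        return $file =~ m{(^|/)\.};   # e.g. prune dotfiles and .git
    }

    # Files already known from the last run skip the test entirely;
    # short-circuit evaluation means file_pruned() is never called.
    my %known = map { $_ => 1 } ("index.mdwn", "blog/post1.mdwn");

    foreach my $file ("index.mdwn", "blog/post1.mdwn", ".git/config") {
        next if !$known{$file} && file_pruned($file);
        print "considering $file\n";
    }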
|
|
|
|
|
|
This is very common, and the code has to test each type differently, since
the list of candidates to test, as well as the test, will vary per type.
Much happier with this code now.
|
|
Such a dependency is unlikely, but can happen.
|
|
Only one new global left
|
|
killed one global
|
|
I'd rather have the global variables than the 300-line function
|
|
This new method for determining when links on pages
have changed should be more efficient, since it avoids
double calculation of the bestlinks.
It also allows collecting data about the old links, before
the scan pass, so the data is accurate when pages move around
and bestlinks change.
Also got some code back to a saner indent level.
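
Conceptually (a toy sketch): snapshot the link lists before the scan
pass, then a single comparison afterwards finds every page whose links
changed:

    #!/usr/bin/perl
    use strict;
    use warnings;

    my %links = (index => ["blog", "about"]);

    # Snapshot the old links before scanning...
    my %oldlinks = map { $_ => [@{$links{$_}}] } keys %links;

    # ...the scan pass rewrites %links...
    $links{index} = ["blog", "contact"];

    # ...then diff once, with no second bestlink calculation needed.
    foreach my $page (keys %links) {
        my $old = join("\0", sort @{$oldlinks{$page} || []});
        my $new = join("\0", sort @{$links{$page}});
        print "links changed on $page\n" if $old ne $new;
    }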
|
|
|
|
Involved some code refactoring so that the same code that detects
link changes for backlink updating can be used for link dependency
checking. The nice thing is that link dep checking is thus
completely free!
|
|
|
|
Simplify, change default content depends number to 1,
change interface to make more sense.
|
|
Conflicts:
IkiWiki/Render.pm
|
|
Preliminary support, anyway.
If a dependency only includes DEPEND_EXISTS, then only changes that
involved adding or deleting a page can trigger it.
This is complicated by internal pages, since the code did not previously
differentiate between add, delete, and change of internal pages.
Now it tracks change separately from add+delete, so DEPEND_EXISTS pagespecs
that actually match internal pages (which will probably be quite rare in
practice) should work.
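
As a toy model (the flag values and the helper are assumptions, not
ikiwiki's actual constants):

    #!/usr/bin/perl
    use strict;
    use warnings;

    # Dependency-type bit flags; the values are illustrative.
    use constant DEPEND_CONTENT => 1;
    use constant DEPEND_EXISTS  => 2;

    # An existence-only dependency fires on page adds and deletes,
    # never on mere content changes.
    sub triggered {
        my ($deptype, %event) = @_;
        return 1 if ($deptype & DEPEND_CONTENT) && $event{changed};
        return 1 if ($deptype & DEPEND_EXISTS) && ($event{added} || $event{deleted});
        return 0;
    }

    my $deptype = DEPEND_EXISTS;
    print "rebuild needed\n" if triggered($deptype, added => 1);
    print "no rebuild\n" unless triggered($deptype, changed => 1);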
|
|
As soon as a change happens, we know we will need to rescan all
dependencies from the start, so bail out of the current scan partway to
avoid doing redundant work.
Only problem with this is that ikiwiki sometimes ends up printing out
dependencies that, while correct, are not obvious. Before:
building B, which depends on A
building C, which depends on A
building D, which depends on A
After:
building B, which depends on A
building C, which depends on B
building D, which depends on C
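
A sketch of the bail-out (toy stand-ins throughout); because each
rescan starts over, later rebuilds get attributed to whichever change
the new pass notices first, hence the chained messages above:

    #!/usr/bin/perl
    use strict;
    use warnings;

    my @pages = ("B", "C", "D");
    my %needs_rebuild = (B => 1, C => 1, D => 1);   # all depend on changed page A
    my %built;

    # As soon as one page is queued, restart the scan from the top:
    # building it may change what the remaining pages depend on, so
    # finishing the current pass would be redundant work.
    SCAN: while (1) {
        foreach my $page (@pages) {
            next if $built{$page};
            if ($needs_rebuild{$page}) {
                print "building $page\n";
                $built{$page} = 1;
                next SCAN;
            }
        }
        last SCAN;
    }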
|
|
This is a rather expensive solution to the transitive dependency problem.
|
|
Instead, use the change and delete hooks, and launch rsync if either hook
is called.
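
Sketched as plugin registration (hook() is ikiwiki's real API; the
handler body and the rsync destination are made up for illustration):

    #!/usr/bin/perl
    package IkiWiki::Plugin::rsync;
    use warnings;
    use strict;
    use IkiWiki 3.00;

    sub import {
        hook(type => "change", id => "rsync", call => \&postupdate);
        hook(type => "delete", id => "rsync", call => \&postupdate);
    }

    my $ran = 0;
    sub postupdate {
        return if $ran++;   # launch rsync once even if both hooks fire
        system("rsync", "-az", "$config{destdir}/", 'user@host:/srv/www/');
    }

    1;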
|
|
the last merge. I should really put each feature on its own git branch.
|
|
|
|
|
|
It's not "exact" since case munging has to be done, and I think
"simple" captures the optimisation better.</pedant>
With apologies to smcv, who probably has to rebuild his wiki now.
|
|
As per Joey's review
|
|
|
|
|
|
single page
Let E be the number of dependencies per page of the form "A depends on B and
nothing else", let D be the number of other dependencies per page,
let P be the total number of pages, and let C be the number of changed
pages in a refresh.
This patch should speed up a refresh from O(E*C*P + D*C*P) to
O(C + E*P + D*C*P), assuming that hash lookups are O(1).
In practice, plugins like inline and map produce a lot of these very simple
dependencies, and my album plugin's combination of inline with a large
number of pages causes it to suffer particularly badly.
In testing on a wiki with about 7000 objects (3500 full pages, 3500
images), a full rebuild continued to take about 5:30, and a refresh
after touching about 350 pages and 350 images reduced from 5:30 to 1:30.
As with my previous optimizations, this change will result in downgrades not
working correctly until the wiki is rebuilt.
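
A toy model of the fast path (the hash layout is an assumption based on
the description): a simple dependency becomes an O(1) hash probe rather
than a pagespec match over every page:

    #!/usr/bin/perl
    use strict;
    use warnings;

    # "A depends on B and nothing else" is stored as a plain hash entry
    # (keys lowercased), so checking it against a changed page is a
    # single lookup instead of a pagespec evaluation.
    my %depends_simple = (
        "albums/holiday" => { "albums/holiday/img1" => 1 },
    );

    my @changed = ("albums/holiday/img1");

    foreach my $page (keys %depends_simple) {
        foreach my $c (@changed) {
            print "rebuild $page\n" if $depends_simple{$page}{lc $c};
        }
    }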
|
|
This should be more efficient than pagespec_match_list since it short-circuits
after the first match is found.
The other problem with using pagespec_match_list here is that it may throw an
error if a bad or failing pagespec somehow got into the dependencies.
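
In sketch form (pagespec_match is ikiwiki's API; the wrapper and the
claim that a bad pagespec merely fails to match are assumptions here):

    #!/usr/bin/perl
    use strict;
    use warnings;
    use IkiWiki;

    # Return as soon as any dependency matches, rather than building
    # the full match list the way pagespec_match_list would.
    sub any_dep_matches {
        my ($page, @deps) = @_;
        foreach my $spec (@deps) {
            return 1 if IkiWiki::pagespec_match($page, $spec);
        }
        return 0;
    }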
|
|
|
|
|
|
On a large wiki you can spend a lot of time reading through large lists
of dependencies to see whether files need to be rebuilt (album, with its
one-page-per-photo arrangement, suffers particularly badly from this).
The dependency list is currently a single pagespec, but it's not used like
a normal pagespec - in practice, it's a list of pagespecs joined with the
"or" operator.
Accordingly, change it to be stored as a list of pagespecs. On a wiki
with many tagged photo albums, this reduces the time to refresh after
`touch tags/*.mdwn` from about 31 to 25 seconds.
Getting the benefit of this change on an existing wiki requires a rebuild.
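
The change in storage, sketched (the hash-of-pagespecs layout is an
assumption consistent with the description):

    #!/usr/bin/perl
    use strict;
    use warnings;

    my %depends;
    my $page = "index";

    # Before: one ever-growing pagespec joined with "or":
    #   $depends{$page} = "posts/* or tags/photo or comment(posts/*)";

    # After: one entry per pagespec, so each is parsed and matched on
    # its own and duplicate dependencies collapse for free.
    $depends{$page}{$_} = 1 for ("posts/*", "tags/photo", "comment(posts/*)");

    print "$page depends on: ", join(", ", sort keys %{$depends{$page}}), "\n";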
|
|
assumption that uploading an entire site is efficient.
|
|
During backlink calculation, all links are examined and broken links can
be detected for free, so store a list of broken links and have brokenlinks
use it.
Exposing the %brokenlinks structure is a bit ugly, but the speedup seems
worth it: around 1 second for wikis the size of the doc wiki that use
brokenlinks.
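
In toy form (real link resolution goes through bestlink; a bare
existence test stands in for it here):

    #!/usr/bin/perl
    use strict;
    use warnings;

    my %links       = (index => ["blog", "missing"]);
    my %pagesources = (index => "index.mdwn", blog => "blog.mdwn");
    my %brokenlinks;

    # Every link is resolved during the backlink pass anyway, so noting
    # the ones that resolve to nothing is nearly free.
    foreach my $page (keys %links) {
        foreach my $link (@{$links{$page}}) {
            push @{$brokenlinks{$link}}, $page
                unless exists $pagesources{$link};
        }
    }

    # The brokenlinks plugin now just reads the cached structure.
    print "broken: $_ <- @{$brokenlinks{$_}}\n" for sort keys %brokenlinks;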
|