Compare commits
334 commits
| Author | SHA1 | Date |
|---|---|---|
|  | 08cb857664 |  |
|  | 309d8dc773 |  |
|  | fabdefad82 |  |
|  | 616b1d9e3e |  |
|  | 7a7dec7751 |  |
|  | 28dec3694a |  |
|  | 59b50e7f4b |  |
|  | a261ef841e |  |
|  | 109b70ea76 |  |
|  | 5341de86fb |  |
|  | 987277f1cf |  |
|  | 589fed6107 |  |
|  | 61ab45fc65 |  |
|  | 6c24e8a7d3 |  |
|  | 42bd1afb09 |  |
|  | 9ed5912461 |  |
|  | 5ce3fddf62 |  |
|  | 8f5d9b2c97 |  |
|  | 401dd4fa8e |  |
|  | f818f011e3 |  |
|  | c9eef21e43 |  |
|  | a5e565b5d6 |  |
|  | 2cefdaa1a5 |  |
|  | 74e3b26790 |  |
|  | dcc72aebf7 |  |
|  | de720ecfe9 |  |
|  | e44dede92f |  |
|  | 7701bdb0a8 |  |
|  | c39205d308 |  |
|  | 2788de47b8 |  |
|  | 1e18e71b9d |  |
|  | 51dcf4a02f |  |
|  | 702cdf931e |  |
|  | bfb3c6338c |  |
|  | 30abef474d |  |
|  | ca6d06c2c3 |  |
|  | 1628ce58c7 |  |
|  | 10dcf076a9 |  |
|  | 086c95f98d |  |
|  | 2725a1ced4 |  |
|  | 835e21c56c |  |
|  | 71eabe2f23 |  |
|  | 5c1bf8a8de |  |
|  | a63e16efc5 |  |
|  | b47a828af0 |  |
|  | 420dc3fe7f |  |
|  | 51f468182f |  |
|  | eeccc0b260 |  |
|  | fc0b86005c |  |
|  | bd0fe05003 |  |
|  | 890b166a43 |  |
|  | e504d1f42b |  |
|  | b0f1317713 |  |
|  | 04bf9aff39 |  |
|  | 07c925de5d |  |
|  | 89a82ba55e |  |
|  | 12b6b2cf28 |  |
|  | fbe17cae44 |  |
|  | 6dca5c638d |  |
|  | fe911dfcaa |  |
|  | 3049849ea6 |  |
|  | a18a00e773 |  |
|  | 01c36fc699 |  |
|  | 1df57c8154 |  |
|  | 902a7d667e |  |
|  | 8fbb7ddf27 |  |
|  | 86d5628577 |  |
|  | 5356370233 |  |
|  | 9d8a8a5a15 |  |
|  | 977c7e9c1f |  |
|  | e2ca223432 |  |
|  | ca45e8429b |  |
|  | 20abe4f86b |  |
|  | 10d4942912 |  |
|  | b5eede90c6 |  |
|  | dc67c929cf |  |
|  | e0d39084c6 |  |
|  | e392792e1e |  |
|  | a729e05132 |  |
|  | 1ee71a9dfb |  |
|  | 61f715515f |  |
|  | d67b35da2c |  |
|  | 349a4845c0 |  |
|  | ceb2f31e98 |  |
|  | f7f6b4d885 |  |
|  | bdd6b15397 |  |
|  | dcb2b60c41 |  |
|  | 9bd09f24a2 |  |
|  | d04972026f |  |
|  | 7b65687da5 |  |
|  | aaf43af0d9 |  |
|  | 4675bf864c |  |
|  | 05de81e0f0 |  |
|  | 51db5e1204 |  |
|  | 8c376a31f5 |  |
|  | c7d35464ae |  |
|  | 758e5a9059 |  |
|  | 410cf58067 |  |
|  | a949a9e8e7 |  |
|  | 5066f417a9 |  |
|  | 093de8b0a1 |  |
|  | 0d5b2992c0 |  |
|  | d2cd0c48f3 |  |
|  | e39e53d0fb |  |
|  | 9cb8b99353 |  |
|  | c10b46111b |  |
|  | 540d860e28 |  |
|  | cbf34deade |  |
|  | 073b1138b7 |  |
|  | 9819c718b1 |  |
|  | 64824cb3b1 |  |
|  | ac7c703ff0 |  |
|  | b13ed10eaf |  |
|  | 43bba003b2 |  |
|  | a8c34581ea |  |
|  | fdb68e1ade |  |
|  | 63686c3598 |  |
|  | 72034cb042 |  |
|  | e47fb78603 |  |
|  | 86992b9787 |  |
|  | ca9106054b |  |
|  | 44442a09c9 |  |
|  | 23c395bcbd |  |
|  | 9c47412991 |  |
|  | 0b8e6409cf |  |
|  | f630bef3b5 |  |
|  | 545158b56f |  |
|  | f98853379e |  |
|  | ccf86defaf |  |
|  | 49cb1f92ee |  |
|  | e9c02b8e2d |  |
|  | 7de72e5e8d |  |
|  | 564ef0bd87 |  |
|  | c2226f9134 |  |
|  | 0f7891de8d |  |
|  | a48fe6d41b |  |
|  | d79c2b0a28 |  |
|  | e508ef052f |  |
|  | 8ae649c761 |  |
|  | 5d0a3c73d2 |  |
|  | e18c9fa94d |  |
|  | 49ef5871bc |  |
|  | afe732a69f |  |
|  | ec61905fc4 |  |
|  | 7e155f4219 |  |
|  | 8b01949f4b |  |
|  | 6a8ff3f1d2 |  |
|  | e1ccba4f57 |  |
|  | 7d1669827a |  |
|  | c314f284a2 |  |
|  | 375f9ebafe |  |
|  | cf62a0fada |  |
|  | e6d89d57fc |  |
|  | e4dea56e49 |  |
|  | 59c8941373 |  |
|  | 9e801582d5 |  |
|  | 435f2140da |  |
|  | 145c6de2fb |  |
|  | 1b591a040c |  |
|  | 2a6ab4c125 |  |
|  | 39e8eb4c94 |  |
|  | 9d65e1b1bf |  |
|  | ef4291c722 |  |
|  | 086af1497c |  |
|  | 8a70e30d97 |  |
|  | 4865f3c8c6 |  |
|  | 3d92f30574 |  |
|  | 67aead1db2 |  |
|  | 4af94d9e71 |  |
|  | 84c732bfc0 |  |
|  | ce6157efee |  |
|  | 08331387b7 |  |
|  | 262ceabca3 |  |
|  | 3d571e915a |  |
|  | 1c8a0808a6 |  |
|  | d7c4834dce |  |
|  | e8a977720f |  |
|  | 229e251482 |  |
|  | 422b220494 |  |
|  | 36f413173a |  |
|  | 9d94beec68 |  |
|  | 23b2fcee08 |  |
|  | 25d8a2e2fb |  |
|  | 9eba4e8b8e |  |
|  | 5d41dea79d |  |
|  | 93b56025fa |  |
|  | 59f880ea86 |  |
|  | 1f84f87888 |  |
|  | f549334226 |  |
|  | 26f4ae777b |  |
|  | 8cc833a7b1 |  |
|  | f17023255c |  |
|  | 0f7a6a88ef |  |
|  | b749a05473 |  |
|  | a08580fe7e |  |
|  | 3abea41ffa |  |
|  | 56879ed9fb |  |
|  | cf1459251e |  |
|  | 48e181674a |  |
|  | 02670a813c |  |
|  | 3231acf525 |  |
|  | 3431646fba |  |
|  | ab74059c77 |  |
|  | aeb3ff6d89 |  |
|  | 852155a4db |  |
|  | 4696590a73 |  |
|  | ba11374c6f |  |
|  | 4b560f64f4 |  |
|  | 50dcd80b85 |  |
|  | 4ba0dab260 |  |
|  | e6a0300b9b |  |
|  | 9aa111d21b |  |
|  | 8a56d41ebc |  |
|  | 87e15be561 |  |
|  | 77ea2b513f |  |
|  | 4e3c5922ee |  |
|  | 428670b4e7 |  |
|  | c2bc225dc0 |  |
|  | 483cfee90c |  |
|  | cbf1b7cf0e |  |
|  | b014c00d24 |  |
|  | d61c9625f4 |  |
|  | 65c34deb43 |  |
|  | 2b890b0316 |  |
|  | 1b1e0ebc8b |  |
|  | e136b132e6 |  |
|  | b44fdb6107 |  |
|  | e390c62072 |  |
|  | ef641b5e31 |  |
|  | c9472c5dff |  |
|  | ed30a4d25b |  |
|  | 8a87e7effa |  |
|  | f5e1920a15 |  |
|  | 8491a54b0d |  |
|  | debdd9d004 |  |
|  | 37dcda8e3b |  |
|  | 3ea39547a7 |  |
|  | 017b80f59b |  |
|  | 19bc94a756 |  |
|  | 8423fa136f |  |
|  | 5da5506c65 |  |
|  | 31ef31801e |  |
|  | 09ed55a17e |  |
|  | a9aad1ed8f |  |
|  | 65a7badd4d |  |
|  | bd1e365ca0 |  |
|  | ca1231a576 |  |
|  | de6a7893fc |  |
|  | 194148b5b3 |  |
|  | 6fd22b6416 |  |
|  | 72830a4647 |  |
|  | 76e6094247 |  |
|  | 0b064e0565 |  |
|  | dd7a047408 |  |
|  | 010449bafa |  |
|  | 97df2c14de |  |
|  | 923350907d |  |
|  | 544a7d269e |  |
|  | cd643f1c0b |  |
|  | 64b5ae8b03 |  |
|  | 142fd24fc8 |  |
|  | 5477b39a25 |  |
|  | c9439be432 |  |
|  | 2b28567588 |  |
|  | cbff479c65 |  |
|  | d6ea94c6dc |  |
|  | 4db6b78a13 |  |
|  | 10d420614c |  |
|  | ea16af51b2 |  |
|  | a4c61e7b68 |  |
|  | 192b55b98d |  |
|  | 122982b0b9 |  |
|  | f2aef9d4dd |  |
|  | b121e545f7 |  |
|  | f4f9564bde |  |
|  | 6595fe7b97 |  |
|  | 2b31f8f229 |  |
|  | 02264020f5 |  |
|  | 382c159b41 |  |
|  | ba538ea623 |  |
|  | 62e8190a94 |  |
|  | bce57a1ac1 |  |
|  | a030a98982 |  |
|  | ae23a04925 |  |
|  | 8cc0f04874 |  |
|  | 27c838163a |  |
|  | 38ca0a1546 |  |
|  | 6de3c9877c |  |
|  | 1943bfd244 |  |
|  | bf22075407 |  |
|  | 00c1303fb2 |  |
|  | 0516749a0c |  |
|  | c133ba5df9 |  |
|  | af242905af |  |
|  | b3ad86f270 |  |
|  | 337fec115b |  |
|  | e5e1cc865a |  |
|  | bfcc1ebfc4 |  |
|  | b8b0ca107a |  |
|  | 582018adbb |  |
|  | d7ec63052a |  |
|  | 4f088f84c3 |  |
|  | c704d7d9b8 |  |
|  | 05544bf02f |  |
|  | 218f846479 |  |
|  | c5545e3c45 |  |
|  | ba61d0b6ce |  |
|  | 1a8752814f |  |
|  | 7cc669ba0a |  |
|  | 45d8560110 |  |
|  | 26b8df4825 |  |
|  | cfe5433e7a |  |
|  | 6af6c8d229 |  |
|  | ca408a22cc |  |
|  | 5809571ca0 |  |
|  | 40cef5f7a4 |  |
|  | fef686b449 |  |
|  | 5caad132b5 |  |
|  | 8f2af0e466 |  |
|  | a9dd91ffef |  |
|  | cdebda1c32 |  |
|  | 085ba48497 |  |
|  | d5c7b57b09 |  |
|  | 5775bdb28d |  |
|  | c01e6b9a16 |  |
|  | 2dc80fa971 |  |
|  | ca5c9fc5fd |  |
|  | 2627af97ad |  |
|  | 741650f926 |  |
|  | f096e412ea |  |
|  | e8fc763b5c |  |
|  | 6ea6e23df0 |  |
|  | de16d9556a |  |
|  | c60e2870c4 |  |
195 changed files with 6545 additions and 637 deletions
1 .gitignore vendored

```diff
@@ -8,7 +8,6 @@ doc/man/*.html
 doc/man/*.htm
 doc/man/*.texi
 doc/man/*.man
-test/*
 .*.swp
 doc/man/*.[0-9]
 doc/*.xml
```
12 .gitlab-ci.yml Normal file

```diff
@@ -0,0 +1,12 @@
+stages:
+  - test
+
+unit_tests:
+  stage: test
+  script:
+    - make test
+
+shellcheck:
+  stage: test
+  script:
+    - make shellcheck
```
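Both CI jobs above only wrap existing make targets, so the same checks can be reproduced locally before pushing. A minimal sketch, assuming GNU make and `shellcheck` are installed and the commands are run from the repository root:

```sh
#!/bin/sh
# Reproduce the GitLab CI checks locally.
set -e           # abort on the first failing check

make test        # what the "unit_tests" job runs
make shellcheck  # what the "shellcheck" job runs
```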
757 COPYING

```diff
@@ -1,165 +1,674 @@
-GNU LESSER GENERAL PUBLIC LICENSE
+GNU GENERAL PUBLIC LICENSE
 Version 3, 29 June 2007
```

(The rest of this hunk replaces the full text of the GNU Lesser General Public License, version 3, with the full text of the GNU General Public License, version 3, including its "How to Apply These Terms to Your New Programs" appendix.)
12
CREDITS
12
CREDITS
|
|
@ -1,9 +1,15 @@
|
||||||
Thanks go to the following people:
|
Thanks go to the following people (sorted alphabetically):
|
||||||
|
|
||||||
* the whole #cLinux channel
|
|
||||||
- for testing and debugging (those I mean should know ;-)
|
|
||||||
* Alexey Maximov
|
* Alexey Maximov
|
||||||
- for finding return-value and shell limitation bugs
|
- for finding return-value and shell limitation bugs
|
||||||
|
* #cLinux IRC channel on irc.freenode.org
|
||||||
|
- for testing and debugging (those I mean should know ;-)
|
||||||
|
* Daniel Aubry
|
||||||
|
- for reporting many hints
|
||||||
|
* Jens-Christoph Brendel
|
||||||
|
- Added automatic backup manager (contrib/jbrendel-autobackup)
|
||||||
|
* John Lawless
|
||||||
|
- A lot of patches and some very interesting discussions.
|
||||||
* Markus Meier
|
* Markus Meier
|
||||||
- for finding a really simple solution for choosing the right backup to
|
- for finding a really simple solution for choosing the right backup to
|
||||||
clone from: Make it independent of the interval, simply choose the last
|
clone from: Make it independent of the interval, simply choose the last
|
||||||
|
|
|
||||||
118
Makefile
118
Makefile
|
|
@ -22,14 +22,14 @@
|
||||||
#
|
#
|
||||||
|
|
||||||
INSTALL=install
|
INSTALL=install
|
||||||
CCOLLECT_SOURCE=ccollect.sh
|
CCOLLECT_SOURCE=ccollect
|
||||||
CCOLLECT_DEST=ccollect.sh
|
CCOLLECT_DEST=ccollect
|
||||||
LN=ln -sf
|
LN=ln -sf
|
||||||
ASCIIDOC=asciidoc
|
ASCIIDOC=asciidoc
|
||||||
DOCBOOKTOTEXI=docbook2x-texi
|
DOCBOOKTOTEXI=docbook2x-texi
|
||||||
DOCBOOKTOMAN=docbook2x-man
|
DOCBOOKTOMAN=docbook2x-man
|
||||||
XSLTPROC=xsltproc
|
XSLTPROC=xsltproc
|
||||||
XSL=/usr/share/xml/docbook/stylesheet/nwalsh/html/docbook.xsl
|
XSL=/usr/local/share/xsl/docbook/html/docbook.xsl
|
||||||
A2X=a2x
|
A2X=a2x
|
||||||
|
|
||||||
prefix=/usr/packages/ccollect-git
|
prefix=/usr/packages/ccollect-git
|
||||||
|
|
@ -41,11 +41,7 @@ manlink=/usr/local/man/man1
|
||||||
|
|
||||||
path_dir=/usr/local/bin
|
path_dir=/usr/local/bin
|
||||||
path_destination=${path_dir}/${CCOLLECT_DEST}
|
path_destination=${path_dir}/${CCOLLECT_DEST}
|
||||||
|
docs_archive_name=docs.tar
|
||||||
# where to publish
|
|
||||||
host=home.schottelius.org
|
|
||||||
dir=www/org/schottelius/unix/www/ccollect/
|
|
||||||
docdir=${dir}/doc
|
|
||||||
|
|
||||||
#
|
#
|
||||||
# Asciidoc will be used to generate other formats later
|
# Asciidoc will be used to generate other formats later
|
||||||
|
|
@ -57,7 +53,7 @@ MANDOCS = doc/man/ccollect.text \
|
||||||
doc/man/ccollect_logwrapper.text \
|
doc/man/ccollect_logwrapper.text \
|
||||||
doc/man/ccollect_list_intervals.text
|
doc/man/ccollect_list_intervals.text
|
||||||
|
|
||||||
DOCS = ${MANDOCS} doc/ccollect.text doc/ccollect-DE.text
|
DOCS = ${MANDOCS} doc/ccollect.text
|
||||||
|
|
||||||
#
|
#
|
||||||
# Doku
|
# Doku
|
||||||
|
|
@ -79,12 +75,7 @@ DOCBDOCS = ${DOCS:.text=.docbook}
|
||||||
|
|
||||||
DOC_ALL = ${HTMLDOCS} ${DBHTMLDOCS} ${TEXIDOCS} ${MANPDOCS} ${PDFDOCS}
|
DOC_ALL = ${HTMLDOCS} ${DBHTMLDOCS} ${TEXIDOCS} ${MANPDOCS} ${PDFDOCS}
|
||||||
|
|
||||||
html: ${HTMLDOCS}
|
TEST_LOG_FILE = /tmp/ccollect/ccollect.log
|
||||||
htm: ${DBHTMLDOCS}
|
|
||||||
info: ${TEXIDOCS}
|
|
||||||
man: ${MANPDOCS}
|
|
||||||
pdf: ${PDFDOCS}
|
|
||||||
documentation: ${DOC_ALL}
|
|
||||||
|
|
||||||
#
|
#
|
||||||
# End user targets
|
# End user targets
|
||||||
|
|
@ -96,6 +87,15 @@ all:
|
||||||
@echo "info: only generate Texinfo"
|
@echo "info: only generate Texinfo"
|
||||||
@echo "man: only generate manpage{s}"
|
@echo "man: only generate manpage{s}"
|
||||||
@echo "install: install ccollect to ${prefix}"
|
@echo "install: install ccollect to ${prefix}"
|
||||||
|
@echo "shellcheck: shellcheck ccollect script"
|
||||||
|
@echo "test: run unit tests"
|
||||||
|
|
||||||
|
html: ${HTMLDOCS}
|
||||||
|
htm: ${DBHTMLDOCS}
|
||||||
|
info: ${TEXIDOCS}
|
||||||
|
man: ${MANPDOCS}
|
||||||
|
pdf: ${PDFDOCS}
|
||||||
|
documentation: ${DOC_ALL}
|
||||||
|
|
||||||
install: install-link install-manlink
|
install: install-link install-manlink
|
||||||
|
|
||||||
|
|
@ -113,6 +113,32 @@ install-manlink: install-man
|
||||||
${INSTALL} -d -m 0755 ${manlink}
|
${INSTALL} -d -m 0755 ${manlink}
|
||||||
for man in ${mandest}/*; do ${LN} $$man ${manlink}; done
|
for man in ${mandest}/*; do ${LN} $$man ${manlink}; done
|
||||||
|
|
||||||
|
#
|
||||||
|
# Tools
|
||||||
|
#
|
||||||
|
TOOLS2=ccollect_add_source
|
||||||
|
TOOLS2 += ccollect_analyse_logs
|
||||||
|
|
||||||
|
TOOLS=ccollect_add_source \
|
||||||
|
ccollect_analyse_logs \
|
||||||
|
ccollect_delete_source \
|
||||||
|
ccollect_list_intervals \
|
||||||
|
ccollect_logwrapper \
|
||||||
|
ccollect_list_intervals
|
||||||
|
|
||||||
|
# Stick to posix
|
||||||
|
TOOLSMAN1 = $(TOOLS:ccollect=doc/man/ccollect)
|
||||||
|
TOOLSMAN = $(TOOLSMAN1:=.text)
|
||||||
|
|
||||||
|
TOOLSFP = $(subst ccollect,tools/ccollect,$(TOOLS))
|
||||||
|
|
||||||
|
## FIXME: posix make: shell? =>
|
||||||
|
|
||||||
|
t2:
|
||||||
|
echo $(TOOLS) - $(TOOLSFP)
|
||||||
|
echo $(TOOLSMAN)
|
||||||
|
echo $(TOOLSFP)
|
||||||
|
|
||||||
|
|
||||||
# docbook gets .htm, asciidoc directly .html
|
# docbook gets .htm, asciidoc directly .html
|
||||||
%.htm: %.docbook
|
%.htm: %.docbook
|
||||||
|
|
@ -147,13 +173,13 @@ install-manlink: install-man
|
||||||
#
|
#
|
||||||
# Developer targets
|
# Developer targets
|
||||||
#
|
#
|
||||||
update:
|
pub:
|
||||||
@git push
|
git push
|
||||||
|
|
||||||
publish-doc: documentation
|
publish-doc: documentation
|
||||||
@echo "Transferring files to ${host}"
|
|
||||||
@chmod a+r ${DOCS} ${DOC_ALL}
|
@chmod a+r ${DOCS} ${DOC_ALL}
|
||||||
@tar c ${DOCS} ${DOC_ALL} | ssh ${host} "cd ${dir}; tar xv"
|
@tar cf ${docs_archive_name} ${DOCS} ${DOC_ALL}
|
||||||
|
@echo "Documentation files are in ${docs_archive_name}"
|
||||||
|
|
||||||
#
|
#
|
||||||
# Distribution
|
# Distribution
|
||||||
|
|
@ -170,5 +196,55 @@ distclean: clean
|
||||||
#
|
#
|
||||||
dist: distclean documentation
|
dist: distclean documentation
|
||||||
|
|
||||||
test: ccollect.sh documentation
|
/tmp/ccollect:
|
||||||
CCOLLECT_CONF=./conf ./ccollect.sh daily "source with spaces"
|
mkdir -p /tmp/ccollect
|
||||||
|
|
||||||
|
shellcheck: ./ccollect
|
||||||
|
shellcheck -s sh -f gcc -x ./ccollect
|
||||||
|
|
||||||
|
test-nico: $(CCOLLECT_SOURCE) /tmp/ccollect
|
||||||
|
cd ./conf/sources/; for s in *; do CCOLLECT_CONF=../ ../../ccollect daily "$$s"; done
|
||||||
|
touch /tmp/ccollect/$$(ls /tmp/ccollect | head -n1).ccollect-marker
|
||||||
|
CCOLLECT_CONF=./conf ./ccollect -a daily
|
||||||
|
touch /tmp/ccollect/$$(ls /tmp/ccollect | head -n1).ccollect-marker
|
||||||
|
CCOLLECT_CONF=./conf ./ccollect -a -p daily
|
||||||
|
|
||||||
|
test-dir-source:
|
||||||
|
mkdir -p /tmp/ccollect/source
|
||||||
|
cp -R -f ./* /tmp/ccollect/source
|
||||||
|
|
||||||
|
test-dir-destination:
|
||||||
|
mkdir -p /tmp/ccollect/backup
|
||||||
|
|
||||||
|
test-dir-destination-chint:
|
||||||
|
mkdir -p /tmp/ccollect/backup-chint
|
||||||
|
|
||||||
|
test-fixed-intervals: $(CCOLLECT_SOURCE) test-dir-source test-dir-destination test-dir-destination-chint
|
||||||
|
for s in ./test/conf/sources/*; do \
|
||||||
|
CCOLLECT_CONF=./test/conf ./ccollect -l ${TEST_LOG_FILE} daily "$$(basename "$$s")"; \
|
||||||
|
test "$$(ls -1 /tmp/ccollect/backup | wc -l)" -gt "0" || { cat ${TEST_LOG_FILE}; exit 1; }; \
|
||||||
|
done
|
||||||
|
CCOLLECT_CONF=./test/conf ./ccollect -l ${TEST_LOG_FILE} -a -v daily
|
||||||
|
test "$$(ls -1 /tmp/ccollect/backup | wc -l)" -gt "0" || { cat ${TEST_LOG_FILE}; exit 1; }
|
||||||
|
CCOLLECT_CONF=./test/conf ./ccollect -l ${TEST_LOG_FILE} -a -p daily
|
||||||
|
test "$$(ls -1 /tmp/ccollect/backup | wc -l)" -gt "0" || { cat ${TEST_LOG_FILE}; exit 1; }
|
||||||
|
@printf "\nFixed intervals test ended successfully\n"
|
||||||
|
|
||||||
|
test-interval-changing: $(CCOLLECT_SOURCE) test-dir-source test-dir-destination-chint
|
||||||
|
rm -rf /tmp/ccollect/backup-chint/*
|
||||||
|
test "$$(ls -1 /tmp/ccollect/backup-chint | wc -l)" -eq "0" || { cat ${TEST_LOG_FILE}; exit 1; }
|
||||||
|
printf "3" > ./test/conf/sources/local-with-interval/intervals/daily
|
||||||
|
for x in 1 2 3 4 5; do CCOLLECT_CONF=./test/conf ./ccollect -l ${TEST_LOG_FILE} daily local-with-interval; done
|
||||||
|
test "$$(ls -1 /tmp/ccollect/backup-chint | wc -l)" -eq "3" || { cat ${TEST_LOG_FILE}; exit 1; }
|
||||||
|
printf "5" > ./test/conf/sources/local-with-interval/intervals/daily
|
||||||
|
for x in 1 2 3 4 5 6 7; do CCOLLECT_CONF=./test/conf ./ccollect -l ${TEST_LOG_FILE} daily local-with-interval; done
|
||||||
|
test "$$(ls -1 /tmp/ccollect/backup-chint | wc -l)" -eq "5" || { cat ${TEST_LOG_FILE}; exit 1; }
|
||||||
|
printf "4" > ./test/conf/sources/local-with-interval/intervals/daily
|
||||||
|
for x in 1 2 3 4 5 6; do CCOLLECT_CONF=./test/conf ./ccollect -l ${TEST_LOG_FILE} daily local-with-interval; done
|
||||||
|
test "$$(ls -1 /tmp/ccollect/backup-chint | wc -l)" -eq "4" || { cat ${TEST_LOG_FILE}; exit 1; }
|
||||||
|
printf "3" > ./test/conf/sources/local-with-interval/intervals/daily
|
||||||
|
@printf "\nInterval changing test ended successfully\n"
|
||||||
|
|
||||||
|
test: test-fixed-intervals test-interval-changing
|
||||||
|
test -f "${TEST_LOG_FILE}"
|
||||||
|
@printf "\nTests ended successfully\n"
|
||||||
|
|
|
||||||
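Note on the new test and shellcheck targets above: they are driven entirely through make; a minimal sketch of how a reviewer would exercise them from a checkout (assuming shellcheck and rsync are installed and /tmp/ccollect is writable, as the targets expect):

  # lint the POSIX shell script exactly as the new shellcheck target does
  shellcheck -s sh -f gcc -x ./ccollect

  # run the bundled tests; they back up into /tmp/ccollect/backup* and log to /tmp/ccollect/ccollect.log
  make test

  # or run a single test group
  make test-interval-changing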
46
README
46
README
|
|
@ -7,7 +7,7 @@ ccollect backups (local or remote) data to local or remote destinations.
|
||||||
You can retrieve the latest version of ccollect at [0].
|
You can retrieve the latest version of ccollect at [0].
|
||||||
|
|
||||||
ccollect was inspired by rsnapshot [1], which has some problems:
|
ccollect was inspired by rsnapshot [1], which has some problems:
|
||||||
- configuration parameters has to be TAB seperated
|
- configuration parameters have to be TAB separated
|
||||||
- you can not specify per source exclude lists
|
- you can not specify per source exclude lists
|
||||||
- no per source pre/post execution support
|
- no per source pre/post execution support
|
||||||
- no parallel execution
|
- no parallel execution
|
||||||
|
|
@ -17,33 +17,38 @@ ccollect was inspired by rsnapshot [1], which has some problems:
|
||||||
Please use tools/report_success.sh to report success, if you are successfully
|
Please use tools/report_success.sh to report success, if you are successfully
|
||||||
using ccollect.
|
using ccollect.
|
||||||
|
|
||||||
|
Have a look at doc/HACKING, if you plan to change ccollect.
|
||||||
|
|
||||||
A small try to visualize the differences in a table:
|
A small try to visualize the differences in a table:
|
||||||
|
|
||||||
+---------------+-------------------------------------------------------------+
|
+---------------+-------------------------------------------------------------+
|
||||||
| What? | rsnapshot | ccollect |
|
| What? | rsnapshot | ccollect |
|
||||||
+---------------+-------------------------------------------------------------+
|
+---------------+-------------------------------------------------------------+
|
||||||
| Configuration | tab separated, needs | plain cconfig-style |
|
| Configuration | tab separated, needs | plain cconfig-style |
|
||||||
| | parsing | |
|
| | parsing | |
|
||||||
+---------------+-------------------------------------------------------------+
|
+---------------+-------------------------------------------------------------+
|
||||||
| Per source | | |
|
| Per source | | |
|
||||||
| post-/pre- | no | yes |
|
| post-/pre- | no | yes |
|
||||||
| execution | | |
|
| execution | | |
|
||||||
+---------------+-------------------------------------------------------------+
|
+---------------+-------------------------------------------------------------+
|
||||||
| Per source | | |
|
| Per source | | |
|
||||||
| exclude lists | no | yes |
|
| exclude lists | no | yes |
|
||||||
+---------------+-------------------------------------------------------------+
|
+---------------+-------------------------------------------------------------+
|
||||||
| Parallel | | |
|
| Parallel | | |
|
||||||
| execution | | |
|
| execution | | |
|
||||||
| of multiple | no | yes |
|
| of multiple | no | yes |
|
||||||
| backups | | |
|
| backups | | |
|
||||||
+---------------+-------------------------------------------------------------+
|
+---------------+-------------------------------------------------------------+
|
||||||
| Programming | perl | sh |
|
| Programming | perl | sh |
|
||||||
| language | | (posix compatible) |
|
| language | | (posix compatible) |
|
||||||
+---------------+-------------------------------------------------------------+
|
+---------------+-------------------------------------------------------------+
|
||||||
| Lines of code | 6772 (5353 w/o comments, | 546 (375 w/o comments, |
|
| Lines of code | 6772 (5353 w/o comments, | 546 (375 w/o comments, |
|
||||||
| (2006-10-25) | 4794 w/o empty lines) | 288 w/o empty lines) |
|
| (2006-10-25) | 4794 w/o empty lines) | 288 w/o empty lines) |
|
||||||
+---------------+-------------------------------------------------------------+
|
+---------------+-------------------------------------------------------------+
|
||||||
| Age | Available since 2002/2003 | Written at 2005-11-14 |
|
| Lines of code | 7269 (6778 w/o comments, | 587 (397 w/o comments, |
|
||||||
|
| (2009-07-23) | 6139 w/o empty lines) | 315 w/o empty lines) |
|
||||||
|
+---------------+-------------------------------------------------------------+
|
||||||
|
| Age | Available since 2002/2003 | Written at 2005-11-14 |
|
||||||
+---------------+-------------------------------------------------------------+
|
+---------------+-------------------------------------------------------------+
|
||||||
|
|
||||||
Included documentation:
|
Included documentation:
|
||||||
|
|
@ -51,13 +56,10 @@ Included documentation:
|
||||||
doc/ccollect.text Manual in text format
|
doc/ccollect.text Manual in text format
|
||||||
doc/ccollect.html Manual in xhtml (generated)
|
doc/ccollect.html Manual in xhtml (generated)
|
||||||
|
|
||||||
doc/ccollect-DE.text German manual in text format (externally maintained)
|
|
||||||
doc/ccollect-DE.html German manual in xhtml (generated)
|
|
||||||
|
|
||||||
doc/man/ccollect.text Manpage in text format
|
doc/man/ccollect.text Manpage in text format
|
||||||
doc/man/ccollect.man Manpage in manpage format (generated)
|
doc/man/ccollect.man Manpage in manpage format (generated)
|
||||||
|
|
||||||
--------------------------------------------------------------------------------
|
--------------------------------------------------------------------------------
|
||||||
[0]: ccollect: http://unix.schottelius.org/ccollect/
|
[0]: ccollect: http://www.nico.schottelius.org/software/ccollect/
|
||||||
[1]: rsnapshot: http://www.rsnapshot.org/
|
[1]: rsnapshot: http://www.rsnapshot.org/
|
||||||
[2]: cconfig: http://nico.schotteli.us/papers/linux/cconfig/
|
[2]: cconfig: http://nico.schotteli.us/papers/linux/cconfig/
|
||||||
|
|
|
||||||
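Note on the "plain cconfig-style" configuration mentioned in the table: every option is a file or directory under $CCOLLECT_CONF rather than a line in a parsed config file. A minimal sketch of one source definition, mirroring the example tree under conf/ in this change (the source name, paths and interval count are illustrative):

  mkdir -p /etc/ccollect/defaults/intervals /etc/ccollect/sources/mydata
  echo 28 > /etc/ccollect/defaults/intervals/daily        # number of daily backups to keep
  echo /home/users/nico/bin > /etc/ccollect/sources/mydata/source
  echo /tmp/ccollect > /etc/ccollect/sources/mydata/destination
  printf '.git\n' > /etc/ccollect/sources/mydata/exclude  # optional per-source exclude list
  touch /etc/ccollect/sources/mydata/verbose               # boolean options are empty files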
26
TODO
Normal file
26
TODO
Normal file
|
|
@ -0,0 +1,26 @@
|
||||||
|
Note:
|
||||||
|
Two sources backing up to one destination does not work, because the two
sources link from different directories.
|
||||||
|
|
||||||
|
Listing:
|
||||||
|
.../* does not work in some cases (depending on the shell?)
|
||||||
|
|
||||||
|
Is-up check: optional!
|
||||||
|
|
||||||
|
line 318/check no_xxx => correct test?
|
||||||
|
|
||||||
|
REMOVE ALL PCMD code
|
||||||
|
|
||||||
|
Backup to remote can be done via ssh tunnel!
|
||||||
|
|
||||||
|
|
||||||
|
% remote host: Allow backup host to access our sshd
|
||||||
|
ssh -R4242:localhost:22 backupserver "ccollect interval backupremotehost"
|
||||||
|
|
||||||
|
remove $destination (==ddir)
|
||||||
|
|
||||||
|
--------------------------------------------------------------------------------
|
||||||
|
|
||||||
|
Remote backups:
|
||||||
|
|
||||||
|
ccollect_backup_to $host $remote_port $source $interval
|
||||||
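The ssh tunnel entry above can be made concrete. A hedged sketch only: the port 4242 and host names come from the TODO itself, while feeding the forwarded port to rsync through the per-source rsync_options file is an assumption about how it would be wired up:

  # on the host to be backed up: expose its sshd to the backup server and trigger the run
  ssh -R 4242:localhost:22 backupserver "ccollect daily backupremotehost"

  # on the backup server: point the source at the forwarded port
  echo localhost:/data > /etc/ccollect/sources/backupremotehost/source
  printf '%s\n' -e 'ssh -p 4242' > /etc/ccollect/sources/backupremotehost/rsync_options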
929
ccollect
Executable file
929
ccollect
Executable file
|
|
@ -0,0 +1,929 @@
|
||||||
|
#!/bin/sh
|
||||||
|
#
|
||||||
|
# 2005-2013 Nico Schottelius (nico-ccollect at schottelius.org)
|
||||||
|
# 2016-2019 Darko Poljak (darko.poljak at gmail.com)
|
||||||
|
#
|
||||||
|
# This file is part of ccollect.
|
||||||
|
#
|
||||||
|
# ccollect is free software: you can redistribute it and/or modify
|
||||||
|
# it under the terms of the GNU General Public License as published by
|
||||||
|
# the Free Software Foundation, either version 3 of the License, or
|
||||||
|
# (at your option) any later version.
|
||||||
|
#
|
||||||
|
# ccollect is distributed in the hope that it will be useful,
|
||||||
|
# but WITHOUT ANY WARRANTY; without even the implied warranty of
|
||||||
|
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
|
||||||
|
# GNU General Public License for more details.
|
||||||
|
#
|
||||||
|
# You should have received a copy of the GNU General Public License
|
||||||
|
# along with ccollect. If not, see <http://www.gnu.org/licenses/>.
|
||||||
|
#
|
||||||
|
# Initially written for SyGroup (www.sygroup.ch)
|
||||||
|
# Date: Mon Nov 14 11:45:11 CET 2005
|
||||||
|
|
||||||
|
# Error upon expanding unset variables:
|
||||||
|
set -u
|
||||||
|
|
||||||
|
#
|
||||||
|
# Standard variables (stolen from cconf)
|
||||||
|
#
|
||||||
|
__mydir="${0%/*}"
|
||||||
|
__abs_mydir="$(cd "$__mydir" && pwd -P)"
|
||||||
|
__myname=${0##*/}
|
||||||
|
|
||||||
|
#
|
||||||
|
# where to find our configuration and temporary file
|
||||||
|
#
|
||||||
|
CCOLLECT_CONF="${CCOLLECT_CONF:-/etc/ccollect}"
|
||||||
|
CSOURCES="${CCOLLECT_CONF}/sources"
|
||||||
|
CDEFAULTS="${CCOLLECT_CONF}/defaults"
|
||||||
|
CPREEXEC="${CDEFAULTS}/pre_exec"
|
||||||
|
CPOSTEXEC="${CDEFAULTS}/post_exec"
|
||||||
|
CMARKER=".ccollect-marker"
|
||||||
|
|
||||||
|
TMP="$(mktemp "/tmp/${__myname}.XXXXXX")"
|
||||||
|
export TMP
|
||||||
|
CONTROL_PIPE="/tmp/${__myname}-control-pipe"
|
||||||
|
|
||||||
|
VERSION="2.10"
|
||||||
|
RELEASE="2020-08-26"
|
||||||
|
HALF_VERSION="ccollect ${VERSION}"
|
||||||
|
FULL_VERSION="ccollect ${VERSION} (${RELEASE})"
|
||||||
|
|
||||||
|
#
|
||||||
|
# CDATE: how we use it for naming of the archives
|
||||||
|
# DDATE: how the user should see it in our output (DISPLAY)
|
||||||
|
#
|
||||||
|
CDATE="date +%Y%m%d-%H%M"
|
||||||
|
DDATE="date +%Y-%m-%d-%H:%M:%S"
|
||||||
|
SDATE="date +%s"
|
||||||
|
|
||||||
|
#
|
||||||
|
# LOCKING: use flock if available, otherwise mkdir
|
||||||
|
# Locking is done for each source so that only one instance per source
|
||||||
|
# can run.
|
||||||
|
#
|
||||||
|
# Use CCOLLECT_CONF directory for lock files.
|
||||||
|
# This directory can be set arbitrary so it is writable for user
|
||||||
|
# executing ccollect.
|
||||||
|
LOCKDIR="${CCOLLECT_CONF}"
|
||||||
|
# printf pattern: ccollect_<source>.lock
|
||||||
|
LOCKFILE_PATTERN="ccollect_%s.lock"
|
||||||
|
LOCKFD=4
|
||||||
|
|
||||||
|
#
|
||||||
|
# locking functions using flock
|
||||||
|
#
|
||||||
|
lock_flock()
|
||||||
|
{
|
||||||
|
# $1 = source to backup
|
||||||
|
# shellcheck disable=SC2059
|
||||||
|
lockfile="${LOCKDIR}/$(printf "${LOCKFILE_PATTERN}" "$1")"
|
||||||
|
eval "exec ${LOCKFD}> '${lockfile}'"
|
||||||
|
|
||||||
|
flock -n ${LOCKFD} && return 0 || return 1
|
||||||
|
}
|
||||||
|
|
||||||
|
unlock_flock()
|
||||||
|
{
|
||||||
|
# $1 = source to backup
|
||||||
|
# shellcheck disable=SC2059
|
||||||
|
lockfile="${LOCKDIR}/$(printf "${LOCKFILE_PATTERN}" "$1")"
|
||||||
|
eval "exec ${LOCKFD}>&-"
|
||||||
|
rm -f "${lockfile}"
|
||||||
|
}
|
||||||
|
|
||||||
|
#
|
||||||
|
# locking functions using mkdir (mkdir is atomic)
|
||||||
|
#
|
||||||
|
lock_mkdir()
|
||||||
|
{
|
||||||
|
# $1 = source to backup
|
||||||
|
# shellcheck disable=SC2059
|
||||||
|
lockfile="${LOCKDIR}/$(printf "${LOCKFILE_PATTERN}" "$1")"
|
||||||
|
|
||||||
|
mkdir "${lockfile}" && return 0 || return 1
|
||||||
|
}
|
||||||
|
|
||||||
|
unlock_mkdir()
|
||||||
|
{
|
||||||
|
# $1 = source to backup
|
||||||
|
# shellcheck disable=SC2059
|
||||||
|
lockfile="${LOCKDIR}/$(printf "${LOCKFILE_PATTERN}" "$1")"
|
||||||
|
|
||||||
|
rmdir "${lockfile}"
|
||||||
|
}
|
||||||
|
|
||||||
|
#
|
||||||
|
# determine locking tool: flock or mkdir
|
||||||
|
#
|
||||||
|
if command -v flock > /dev/null 2>&1
|
||||||
|
then
|
||||||
|
lockf="lock_flock"
|
||||||
|
unlockf="unlock_flock"
|
||||||
|
else
|
||||||
|
lockf="lock_mkdir"
|
||||||
|
unlockf="unlock_mkdir"
|
||||||
|
fi
|
||||||
|
|
||||||
|
#
|
||||||
|
# unset values
|
||||||
|
#
|
||||||
|
PARALLEL=""
|
||||||
|
MAX_JOBS=""
|
||||||
|
USE_ALL=""
|
||||||
|
LOGFILE=""
|
||||||
|
SYSLOG=""
|
||||||
|
# e - only errors, a - all output
|
||||||
|
LOGLEVEL="a"
|
||||||
|
LOGONLYERRORS=""
|
||||||
|
|
||||||
|
#
|
||||||
|
# catch signals
|
||||||
|
#
|
||||||
|
TRAPFUNC="rm -f \"${TMP}\""
|
||||||
|
# shellcheck disable=SC2064
|
||||||
|
trap "${TRAPFUNC}" 1 2 15
|
||||||
|
|
||||||
|
#
|
||||||
|
# Functions
|
||||||
|
#
|
||||||
|
|
||||||
|
# check if we are running interactive or non-interactive
|
||||||
|
# see: http://www.tldp.org/LDP/abs/html/intandnonint.html
|
||||||
|
_is_interactive()
|
||||||
|
{
|
||||||
|
[ -t 0 ] || [ -p /dev/stdin ]
|
||||||
|
}
|
||||||
|
|
||||||
|
#
|
||||||
|
# ssh-"feature": we cannot do '... read ...; ssh ...; < file',
|
||||||
|
# because ssh reads stdin! -n does not work -> does not ask for password
|
||||||
|
# Also allow deletion for files without the given suffix
|
||||||
|
#
|
||||||
|
delete_from_file()
|
||||||
|
{
|
||||||
|
file="$1"; shift
|
||||||
|
suffix="" # It will be set, if deleting incomplete backups.
|
||||||
|
[ $# -eq 1 ] && suffix="$1" && shift
|
||||||
|
# dirs for deletion will be moved to this trash dir inside destination dir
|
||||||
|
# - for fast mv operation
|
||||||
|
trash="$(mktemp -d ".trash.XXXXXX")"
|
||||||
|
while read -r to_remove; do
|
||||||
|
mv "${to_remove}" "${trash}" ||
|
||||||
|
_exit_err "Moving ${to_remove} to ${trash} failed."
|
||||||
|
set -- "$@" "${to_remove}"
|
||||||
|
if [ "${suffix}" ]; then
|
||||||
|
to_remove_no_suffix="$(echo "${to_remove}" | sed "s/$suffix\$//")"
|
||||||
|
mv "${to_remove_no_suffix}" "${trash}" ||
|
||||||
|
_exit_err "Moving ${to_remove_no_suffix} to ${trash} failed."
|
||||||
|
set -- "$@" "${to_remove_no_suffix}"
|
||||||
|
fi
|
||||||
|
done < "${file}"
|
||||||
|
_techo "Removing $* in ${trash}..."
|
||||||
|
empty_dir=".empty-dir"
|
||||||
|
mkdir "${empty_dir}" || _exit_err "Empty directory ${empty_dir} cannot be created."
|
||||||
|
[ "${VVERBOSE}" ] && echo "Starting: rsync -a --delete ${empty_dir} ${trash}"
|
||||||
|
# rsync needs ending slash for directory content
|
||||||
|
rsync -a --delete "${empty_dir}/" "${trash}/" || _exit_err "Removing $* failed."
|
||||||
|
rmdir "${trash}" || _exit_err "Removing ${trash} directory failed"
|
||||||
|
rmdir "${empty_dir}" || _exit_err "Removing ${empty_dir} directory failed"
|
||||||
|
_techo "Removing $* in ${trash} finished."
|
||||||
|
}
|
||||||
|
|
||||||
|
display_version()
|
||||||
|
{
|
||||||
|
echo "${FULL_VERSION}"
|
||||||
|
exit 0
|
||||||
|
}
|
||||||
|
|
||||||
|
usage()
|
||||||
|
{
|
||||||
|
cat << eof
|
||||||
|
${__myname}: [args] <interval name> <sources to backup>
|
||||||
|
|
||||||
|
ccollect creates (pseudo) incremental backups
|
||||||
|
|
||||||
|
-h, --help: Show this help screen
|
||||||
|
-a, --all: Backup all sources specified in ${CSOURCES}
|
||||||
|
-e, --errors: Log only errors
|
||||||
|
-j [max], --jobs [max] Specifies the number of jobs to run simultaneously.
|
||||||
|
If max is not specified then parallelise all jobs.
|
||||||
|
-l FILE, --logfile FILE Log to specified file
|
||||||
|
-p, --parallel: Parallelise backup processes (deprecated from 2.0)
|
||||||
|
-s, --syslog: Log to syslog with tag ccollect
|
||||||
|
-v, --verbose: Be very verbose (uses set -x)
|
||||||
|
-V, --version: Print version information
|
||||||
|
|
||||||
|
This is version ${VERSION} released on ${RELEASE}.
|
||||||
|
|
||||||
|
Retrieve latest ccollect at http://www.nico.schottelius.org/software/ccollect/
|
||||||
|
eof
|
||||||
|
exit 0
|
||||||
|
}
|
||||||
|
|
||||||
|
# locking functions
|
||||||
|
lock()
|
||||||
|
{
|
||||||
|
"${lockf}" "$@" || _exit_err \
|
||||||
|
"Only one instance of ${__myname} for source \"$1\" can run at one time."
|
||||||
|
}
|
||||||
|
|
||||||
|
unlock()
|
||||||
|
{
|
||||||
|
"${unlockf}" "$@"
|
||||||
|
}
|
||||||
|
|
||||||
|
# time displaying echo
|
||||||
|
# stdout version
|
||||||
|
_techo_stdout()
|
||||||
|
{
|
||||||
|
echo "$(${DDATE}): $*"
|
||||||
|
}
|
||||||
|
|
||||||
|
# syslog version
|
||||||
|
_techo_syslog()
|
||||||
|
{
|
||||||
|
logger -t ccollect "$@"
|
||||||
|
}
|
||||||
|
|
||||||
|
# specified file version
|
||||||
|
_techo_file()
|
||||||
|
{
|
||||||
|
_techo_stdout "$@" >> "${LOGFILE}"
|
||||||
|
}
|
||||||
|
|
||||||
|
# determine _techo version before parsing options
|
||||||
|
if _is_interactive
|
||||||
|
then
|
||||||
|
_techof="_techo_stdout"
|
||||||
|
else
|
||||||
|
_techof="_techo_syslog"
|
||||||
|
fi
|
||||||
|
|
||||||
|
# _techo with determined _techo version
|
||||||
|
_techo()
|
||||||
|
{
|
||||||
|
if [ "${LOGLEVEL}" = "a" ]
|
||||||
|
then
|
||||||
|
# name is exported before calling this function
|
||||||
|
# shellcheck disable=SC2154
|
||||||
|
set -- ${name:+"[${name}]"} "$@"
|
||||||
|
"${_techof}" "$@"
|
||||||
|
fi
|
||||||
|
}
|
||||||
|
|
||||||
|
_techo_err()
|
||||||
|
{
|
||||||
|
_techo "Error: $*"
|
||||||
|
}
|
||||||
|
|
||||||
|
_exit_err()
|
||||||
|
{
|
||||||
|
_techo_err "$@"
|
||||||
|
rm -f "${TMP}"
|
||||||
|
exit 1
|
||||||
|
}
|
||||||
|
|
||||||
|
#
|
||||||
|
# Parse options
|
||||||
|
#
|
||||||
|
while [ "$#" -ge 1 ]; do
|
||||||
|
case "$1" in
|
||||||
|
-a|--all)
|
||||||
|
USE_ALL=1
|
||||||
|
;;
|
||||||
|
-p|--parallel)
|
||||||
|
_techo "Warning: -p, --parallel option is deprecated," \
|
||||||
|
"use -j, --jobs instead."
|
||||||
|
PARALLEL=1
|
||||||
|
MAX_JOBS=""
|
||||||
|
;;
|
||||||
|
-j|--jobs)
|
||||||
|
PARALLEL=1
|
||||||
|
if [ "$#" -ge 2 ]
|
||||||
|
then
|
||||||
|
case "$2" in
|
||||||
|
-*)
|
||||||
|
;;
|
||||||
|
*)
|
||||||
|
MAX_JOBS=$2
|
||||||
|
shift
|
||||||
|
;;
|
||||||
|
esac
|
||||||
|
fi
|
||||||
|
;;
|
||||||
|
-e|--errors)
|
||||||
|
LOGONLYERRORS="1"
|
||||||
|
;;
|
||||||
|
-l|--logfile)
|
||||||
|
if [ "$#" -ge 2 ]
|
||||||
|
then
|
||||||
|
case "$2" in
|
||||||
|
-*)
|
||||||
|
_exit_err "Missing log file"
|
||||||
|
;;
|
||||||
|
*)
|
||||||
|
LOGFILE="$2"
|
||||||
|
shift
|
||||||
|
;;
|
||||||
|
esac
|
||||||
|
else
|
||||||
|
_exit_err "Missing log file"
|
||||||
|
fi
|
||||||
|
;;
|
||||||
|
-s|--syslog)
|
||||||
|
SYSLOG="1"
|
||||||
|
;;
|
||||||
|
-v|--verbose)
|
||||||
|
set -x
|
||||||
|
;;
|
||||||
|
-V|--version)
|
||||||
|
display_version
|
||||||
|
;;
|
||||||
|
--)
|
||||||
|
# ignore the -- itself
|
||||||
|
shift
|
||||||
|
break
|
||||||
|
;;
|
||||||
|
-h|--help|-*)
|
||||||
|
usage
|
||||||
|
;;
|
||||||
|
*)
|
||||||
|
break
|
||||||
|
;;
|
||||||
|
esac
|
||||||
|
shift
|
||||||
|
done
|
||||||
|
|
||||||
|
# determine _techo version and logging level after parsing options
|
||||||
|
if [ "${LOGFILE}" ]
|
||||||
|
then
|
||||||
|
_techof="_techo_file"
|
||||||
|
LOGLEVEL="a"
|
||||||
|
elif _is_interactive
|
||||||
|
then
|
||||||
|
if [ "${SYSLOG}" ]
|
||||||
|
then
|
||||||
|
_techof="_techo_syslog"
|
||||||
|
LOGLEVEL="a"
|
||||||
|
else
|
||||||
|
_techof="_techo_stdout"
|
||||||
|
LOGLEVEL="e"
|
||||||
|
fi
|
||||||
|
else
|
||||||
|
_techof="_techo_syslog"
|
||||||
|
LOGLEVEL="a"
|
||||||
|
fi
|
||||||
|
|
||||||
|
if [ "${LOGFILE}" ] || [ "${SYSLOG}" ]
|
||||||
|
then
|
||||||
|
if [ "${LOGONLYERRORS}" ]
|
||||||
|
then
|
||||||
|
LOGLEVEL="e"
|
||||||
|
fi
|
||||||
|
fi
|
||||||
|
|
||||||
|
# check that MAX_JOBS is natural number > 0
|
||||||
|
# empty string means run all in parallel
|
||||||
|
if ! echo "${MAX_JOBS}" | grep -q -E '^[1-9][0-9]*$|^$'
|
||||||
|
then
|
||||||
|
_exit_err "Invalid max jobs value \"${MAX_JOBS}\""
|
||||||
|
fi
|
||||||
|
|
||||||
|
#
|
||||||
|
# Setup interval
|
||||||
|
#
|
||||||
|
if [ $# -ge 1 ]; then
|
||||||
|
export INTERVAL="$1"
|
||||||
|
shift
|
||||||
|
else
|
||||||
|
usage
|
||||||
|
fi
|
||||||
|
|
||||||
|
#
|
||||||
|
# Check for configuration directory
|
||||||
|
#
|
||||||
|
[ -d "${CCOLLECT_CONF}" ] || _exit_err "No configuration found in " \
|
||||||
|
"\"${CCOLLECT_CONF}\" (is \$CCOLLECT_CONF properly set?)"
|
||||||
|
|
||||||
|
#
|
||||||
|
# Create (portable!) source "array"
|
||||||
|
#
|
||||||
|
export no_sources=0
|
||||||
|
|
||||||
|
if [ "${USE_ALL}" = 1 ]; then
|
||||||
|
#
|
||||||
|
# Get sources from source configuration
|
||||||
|
#
|
||||||
|
( cd "${CSOURCES}" && ls -1 > "${TMP}" ) || \
|
||||||
|
_exit_err "Listing of sources failed. Aborting."
|
||||||
|
|
||||||
|
while read -r tmp; do
|
||||||
|
eval export "source_${no_sources}=\"${tmp}\""
|
||||||
|
no_sources=$((no_sources + 1))
|
||||||
|
done < "${TMP}"
|
||||||
|
else
|
||||||
|
#
|
||||||
|
# Get sources from command line
|
||||||
|
#
|
||||||
|
while [ "$#" -ge 1 ]; do
|
||||||
|
eval "arg=\"\$1\""
|
||||||
|
shift
|
||||||
|
|
||||||
|
# arg is assigned in the eval above
|
||||||
|
# shellcheck disable=SC2154
|
||||||
|
eval export "source_${no_sources}=\"${arg}\""
|
||||||
|
no_sources="$((no_sources + 1))"
|
||||||
|
done
|
||||||
|
fi
|
||||||
|
|
||||||
|
#
|
||||||
|
# Need at least ONE source to backup
|
||||||
|
#
|
||||||
|
if [ "${no_sources}" -lt 1 ]; then
|
||||||
|
usage
|
||||||
|
else
|
||||||
|
_techo "${HALF_VERSION}: Beginning backup using interval ${INTERVAL}"
|
||||||
|
fi
|
||||||
|
|
||||||
|
#
|
||||||
|
# Look for pre-exec command (general)
|
||||||
|
#
|
||||||
|
if [ -x "${CPREEXEC}" ]; then
|
||||||
|
_techo "Executing ${CPREEXEC} ..."
|
||||||
|
"${CPREEXEC}"; ret=$?
|
||||||
|
_techo "Finished ${CPREEXEC} (return code: ${ret})."
|
||||||
|
|
||||||
|
[ "${ret}" -eq 0 ] || _exit_err "${CPREEXEC} failed. Aborting"
|
||||||
|
fi
|
||||||
|
|
||||||
|
################################################################################
|
||||||
|
#
|
||||||
|
# Let's do the backup - here begins the real stuff
|
||||||
|
#
|
||||||
|
|
||||||
|
# in PARALLEL mode:
|
||||||
|
# * create control pipe
|
||||||
|
# * determine number of jobs to start at once
|
||||||
|
if [ "${PARALLEL}" ]; then
|
||||||
|
mkfifo "${CONTROL_PIPE}"
|
||||||
|
# fd 5 is tied to control pipe
|
||||||
|
eval "exec 5<>'${CONTROL_PIPE}'"
|
||||||
|
TRAPFUNC="${TRAPFUNC}; rm -f \"${CONTROL_PIPE}\""
|
||||||
|
# shellcheck disable=SC2064
|
||||||
|
trap "${TRAPFUNC}" 0 1 2 15
|
||||||
|
|
||||||
|
# determine how many parallel jobs to prestart
|
||||||
|
if [ "${MAX_JOBS}" ]
|
||||||
|
then
|
||||||
|
if [ "${MAX_JOBS}" -le "${no_sources}" ]
|
||||||
|
then
|
||||||
|
prestart="${MAX_JOBS}"
|
||||||
|
else
|
||||||
|
prestart="${no_sources}"
|
||||||
|
fi
|
||||||
|
else
|
||||||
|
prestart=0
|
||||||
|
fi
|
||||||
|
fi
|
||||||
|
|
||||||
|
source_no=0
|
||||||
|
while [ "${source_no}" -lt "${no_sources}" ]; do
|
||||||
|
#
|
||||||
|
# Get current source
|
||||||
|
#
|
||||||
|
eval export name=\"\$source_${source_no}\"
|
||||||
|
source_no=$((source_no + 1))
|
||||||
|
|
||||||
|
#
|
||||||
|
# Start ourself, if we want parallel execution
|
||||||
|
#
|
||||||
|
if [ "${PARALLEL}" ]; then
|
||||||
|
if [ ! "${MAX_JOBS}" ]
|
||||||
|
then
|
||||||
|
# run all in parallel
|
||||||
|
"$0" "${INTERVAL}" "${name}" &
|
||||||
|
continue
|
||||||
|
elif [ "${prestart}" -gt 0 ]
|
||||||
|
then
|
||||||
|
# run prestart child if pending
|
||||||
|
{ "$0" "${INTERVAL}" "${name}"; printf '\n' >&5; } &
|
||||||
|
prestart=$((prestart - 1))
|
||||||
|
continue
|
||||||
|
else
|
||||||
|
# each time a child finishes we get a line from the pipe
|
||||||
|
# and then launch another child
|
||||||
|
while read -r line
|
||||||
|
do
|
||||||
|
{ "$0" "${INTERVAL}" "${name}"; printf '\n' >&5; } &
|
||||||
|
# get out of the loop so we can continue with the main loop
|
||||||
|
# for next source
|
||||||
|
break
|
||||||
|
done <&5
|
||||||
|
continue
|
||||||
|
fi
|
||||||
|
fi
|
||||||
|
|
||||||
|
#
|
||||||
|
# Start subshell for easy log editing
|
||||||
|
#
|
||||||
|
(
|
||||||
|
backup="${CSOURCES}/${name}"
|
||||||
|
c_source="${backup}/source"
|
||||||
|
c_dest="${backup}/destination"
|
||||||
|
c_pre_exec="${backup}/pre_exec"
|
||||||
|
c_post_exec="${backup}/post_exec"
|
||||||
|
|
||||||
|
#
|
||||||
|
# Stderr to stdout, so we can produce nice logs
|
||||||
|
#
|
||||||
|
exec 2>&1
|
||||||
|
|
||||||
|
#
|
||||||
|
# Record start of backup: internal and for the user
|
||||||
|
#
|
||||||
|
begin_s="$(${SDATE})"
|
||||||
|
_techo "Beginning to backup"
|
||||||
|
|
||||||
|
#
|
||||||
|
# Standard configuration checks
|
||||||
|
#
|
||||||
|
if [ ! -e "${backup}" ]; then
|
||||||
|
_exit_err "Source \"${backup}\" does not exist."
|
||||||
|
fi
|
||||||
|
|
||||||
|
#
|
||||||
|
# Configuration _must_ be a directory (cconfig style)
|
||||||
|
#
|
||||||
|
if [ ! -d "${backup}" ]; then
|
||||||
|
_exit_err "\"${backup}\" is not a cconfig-directory. Skipping."
|
||||||
|
fi
|
||||||
|
|
||||||
|
#
|
||||||
|
# Acquire lock for source. If lock cannot be acquired, lock will exit
|
||||||
|
# with error message.
|
||||||
|
#
|
||||||
|
lock "${name}"
|
||||||
|
|
||||||
|
# redefine trap to also unlock (rm lockfile)
|
||||||
|
TRAPFUNC="${TRAPFUNC}; unlock \"${name}\""
|
||||||
|
# shellcheck disable=SC2064
|
||||||
|
trap "${TRAPFUNC}" 1 2 15
|
||||||
|
|
||||||
|
#
|
||||||
|
# First execute pre_exec, which may generate destination or other parameters
|
||||||
|
#
|
||||||
|
if [ -x "${c_pre_exec}" ]; then
|
||||||
|
_techo "Executing ${c_pre_exec} ..."
|
||||||
|
"${c_pre_exec}"; ret="$?"
|
||||||
|
_techo "Finished ${c_pre_exec} (return code ${ret})."
|
||||||
|
|
||||||
|
[ "${ret}" -eq 0 ] || _exit_err "${c_pre_exec} failed. Skipping."
|
||||||
|
fi
|
||||||
|
|
||||||
|
#
|
||||||
|
# Read source configuration
|
||||||
|
#
|
||||||
|
for opt in verbose very_verbose summary exclude rsync_options \
|
||||||
|
delete_incomplete rsync_failure_codes \
|
||||||
|
mtime quiet_if_down ; do
|
||||||
|
if [ -f "${backup}/${opt}" ] || [ -f "${backup}/no_${opt}" ]; then
|
||||||
|
eval "c_$opt=\"${backup}/$opt\""
|
||||||
|
else
|
||||||
|
eval "c_$opt=\"${CDEFAULTS}/$opt\""
|
||||||
|
fi
|
||||||
|
done
|
||||||
|
|
||||||
|
#
|
||||||
|
# Interval definition: First try source specific, fallback to default
|
||||||
|
#
|
||||||
|
c_interval="$(cat "${backup}/intervals/${INTERVAL}" 2>/dev/null)"
|
||||||
|
|
||||||
|
if [ -z "${c_interval}" ]; then
|
||||||
|
c_interval="$(cat "${CDEFAULTS}/intervals/${INTERVAL}" 2>/dev/null)"
|
||||||
|
|
||||||
|
if [ -z "${c_interval}" ]; then
|
||||||
|
_exit_err "No definition for interval \"${INTERVAL}\" found. Skipping."
|
||||||
|
fi
|
||||||
|
fi
|
||||||
|
|
||||||
|
#
|
||||||
|
# Sort by ctime (default) or mtime (configuration option)
|
||||||
|
#
|
||||||
|
# variable is assigned using eval
|
||||||
|
# shellcheck disable=SC2154
|
||||||
|
if [ -f "${c_mtime}" ] ; then
|
||||||
|
TSORT="t"
|
||||||
|
else
|
||||||
|
TSORT="tc"
|
||||||
|
fi
|
||||||
|
|
||||||
|
#
|
||||||
|
# Source configuration checks
|
||||||
|
#
|
||||||
|
if [ ! -f "${c_source}" ]; then
|
||||||
|
_exit_err "Source description \"${c_source}\" is not a file. Skipping."
|
||||||
|
else
|
||||||
|
source=$(cat "${c_source}"); ret="$?"
|
||||||
|
if [ "${ret}" -ne 0 ]; then
|
||||||
|
_exit_err "Source ${c_source} is not readable. Skipping."
|
||||||
|
fi
|
||||||
|
fi
|
||||||
|
|
||||||
|
#
|
||||||
|
# Destination is a path
|
||||||
|
#
|
||||||
|
if [ ! -f "${c_dest}" ]; then
|
||||||
|
_exit_err "Destination ${c_dest} is not a file. Skipping."
|
||||||
|
else
|
||||||
|
ddir="$(cat "${c_dest}")"; ret="$?"
|
||||||
|
if [ "${ret}" -ne 0 ]; then
|
||||||
|
_exit_err "Destination ${c_dest} is not readable. Skipping."
|
||||||
|
fi
|
||||||
|
fi
|
||||||
|
|
||||||
|
#
|
||||||
|
# Parameters: ccollect defaults, configuration options, user options
|
||||||
|
#
|
||||||
|
|
||||||
|
#
|
||||||
|
# Rsync standard options (archive will be added after is-up-check)
|
||||||
|
#
|
||||||
|
set -- "$@" "--delete" "--numeric-ids" "--relative" \
|
||||||
|
"--delete-excluded" "--sparse"
|
||||||
|
|
||||||
|
#
|
||||||
|
# Exclude list
|
||||||
|
#
|
||||||
|
# variable is assigned using eval
|
||||||
|
# shellcheck disable=SC2154
|
||||||
|
if [ -f "${c_exclude}" ]; then
|
||||||
|
set -- "$@" "--exclude-from=${c_exclude}"
|
||||||
|
fi
|
||||||
|
|
||||||
|
#
|
||||||
|
# Output a summary
|
||||||
|
#
|
||||||
|
# variable is assigned using eval
|
||||||
|
# shellcheck disable=SC2154
|
||||||
|
if [ -f "${c_summary}" ]; then
|
||||||
|
set -- "$@" "--stats"
|
||||||
|
fi
|
||||||
|
|
||||||
|
#
|
||||||
|
# Verbosity for rsync, rm, and mkdir
|
||||||
|
#
|
||||||
|
VVERBOSE=""
|
||||||
|
# variable is assigned using eval
|
||||||
|
# shellcheck disable=SC2154
|
||||||
|
if [ -f "${c_very_verbose}" ]; then
|
||||||
|
set -- "$@" "-vv"
|
||||||
|
VVERBOSE="-v"
|
||||||
|
elif [ -f "${c_verbose}" ]; then
|
||||||
|
set -- "$@" "-v"
|
||||||
|
fi
|
||||||
|
|
||||||
|
#
|
||||||
|
# Extra options for rsync provided by the user
|
||||||
|
#
|
||||||
|
# variable is assigned using eval
|
||||||
|
# shellcheck disable=SC2154
|
||||||
|
if [ -f "${c_rsync_options}" ]; then
|
||||||
|
while read -r line; do
|
||||||
|
# Trim line.
|
||||||
|
ln=$(echo "${line}" | awk '{$1=$1;print;}')
|
||||||
|
# Only if ln is non zero length string.
|
||||||
|
#
|
||||||
|
# If ln is empty then rsync '' DEST evaluates
|
||||||
|
# to transfer current directory to DEST which would
|
||||||
|
# with specific options destroy DEST content.
|
||||||
|
if [ -n "${ln}" ]
|
||||||
|
then
|
||||||
|
set -- "$@" "${ln}"
|
||||||
|
fi
|
||||||
|
done < "${c_rsync_options}"
|
||||||
|
fi
|
||||||
|
|
||||||
|
#
|
||||||
|
# Check: source is up and accepting connections (before deleting old backups!)
|
||||||
|
#
|
||||||
|
if ! rsync "$@" "${source}" >/dev/null 2>"${TMP}" ; then
|
||||||
|
# variable is assigned using eval
|
||||||
|
# shellcheck disable=SC2154
|
||||||
|
if [ ! -f "${c_quiet_if_down}" ]; then
|
||||||
|
cat "${TMP}"
|
||||||
|
fi
|
||||||
|
_exit_err "Source ${source} is not readable. Skipping."
|
||||||
|
fi
|
||||||
|
|
||||||
|
#
|
||||||
|
# Add --archive for real backup (looks nice in front)
|
||||||
|
#
|
||||||
|
set -- "--archive" "$@"
|
||||||
|
|
||||||
|
#
|
||||||
|
# Check: destination exists?
|
||||||
|
#
|
||||||
|
cd "${ddir}" || _exit_err "Cannot change to ${ddir}. Skipping."
|
||||||
|
|
||||||
|
#
|
||||||
|
# Check incomplete backups (needs echo to remove newlines)
|
||||||
|
#
|
||||||
|
# shellcheck disable=SC2010
|
||||||
|
ls -1 | grep "${CMARKER}\$" > "${TMP}"; ret=$?
|
||||||
|
|
||||||
|
if [ "$ret" -eq 0 ]; then
|
||||||
|
_techo "Incomplete backups: $(cat "${TMP}")"
|
||||||
|
# variable is assigned using eval
|
||||||
|
# shellcheck disable=SC2154
|
||||||
|
if [ -f "${c_delete_incomplete}" ]; then
|
||||||
|
delete_from_file "${TMP}" "${CMARKER}" &
|
||||||
|
fi
|
||||||
|
fi
|
||||||
|
|
||||||
|
#
|
||||||
|
# Include current time in name, not the time when we began to remove above
|
||||||
|
#
|
||||||
|
destination_name="${INTERVAL}.$(${CDATE}).$$-${source_no}"
|
||||||
|
export destination_name
|
||||||
|
destination_dir="${ddir}/${destination_name}"
|
||||||
|
export destination_dir
|
||||||
|
|
||||||
|
#
|
||||||
|
# Check: maximum number of backups is reached?
|
||||||
|
#
|
||||||
|
# shellcheck disable=SC2010
|
||||||
|
count="$(ls -1 | grep -c "^${INTERVAL}\\.")"
|
||||||
|
|
||||||
|
_techo "Existing backups: ${count} Total keeping backups: ${c_interval}"
|
||||||
|
|
||||||
|
if [ "${count}" -ge "${c_interval}" ]; then
|
||||||
|
# Use oldest directory as new backup destination directory.
|
||||||
|
# It does not need to be deleted; rsync will sync its content.
|
||||||
|
# shellcheck disable=SC2010
|
||||||
|
oldest_bak=$(ls -${TSORT}1r | grep "^${INTERVAL}\\." | head -n 1 || \
|
||||||
|
_exit_err "Listing oldest backup failed")
|
||||||
|
_techo "Using ${oldest_bak} for destination dir ${destination_dir}"
|
||||||
|
if mv "${oldest_bak}" "${destination_dir}"; then
|
||||||
|
# Touch dest dir so it is not sorted wrong in listings below.
|
||||||
|
ls_rm_exclude=$(basename "${destination_dir}")
|
||||||
|
|
||||||
|
# We have something to remove only if count > interval.
|
||||||
|
remove="$((count - c_interval))"
|
||||||
|
else
|
||||||
|
_techo_err "Renaming oldest backup ${oldest_bak} to ${destination_dir} failed, removing it."
|
||||||
|
remove="$((count - c_interval + 1))"
|
||||||
|
ls_rm_exclude=""
|
||||||
|
fi
|
||||||
|
if [ "${remove}" -gt 0 ]; then
|
||||||
|
_techo "Removing ${remove} backup(s)..."
|
||||||
|
|
||||||
|
if [ -z "${ls_rm_exclude}" ]; then
|
||||||
|
# shellcheck disable=SC2010
|
||||||
|
ls -${TSORT}1r | grep "^${INTERVAL}\\." | head -n "${remove}" > "${TMP}" || \
|
||||||
|
_exit_err "Listing old backups failed"
|
||||||
|
else
|
||||||
|
# shellcheck disable=SC2010
|
||||||
|
ls -${TSORT}1r | grep -v "${ls_rm_exclude}" | grep "^${INTERVAL}\\." | head -n "${remove}" > "${TMP}" || \
|
||||||
|
_exit_err "Listing old backups failed"
|
||||||
|
fi
|
||||||
|
|
||||||
|
delete_from_file "${TMP}" &
|
||||||
|
fi
|
||||||
|
fi
|
||||||
|
|
||||||
|
#
|
||||||
|
# Check for backup directory to clone from: Always clone from the latest one!
|
||||||
|
# Exclude destination_dir from the listing; it may be the touched, reused and renamed
|
||||||
|
# oldest existing destination directory.
|
||||||
|
#
|
||||||
|
dest_dir_name=$(basename "${destination_dir}")
|
||||||
|
# shellcheck disable=SC2010
|
||||||
|
last_dir="$(ls -${TSORT}p1 | grep '/$' | grep -v "${dest_dir_name}" | head -n 1)" || \
|
||||||
|
_exit_err "Failed to list contents of ${ddir}."
|
||||||
|
|
||||||
|
#
|
||||||
|
# Clone from old backup, if existing
|
||||||
|
#
|
||||||
|
if [ "${last_dir}" ]; then
|
||||||
|
set -- "$@" "--link-dest=${ddir}/${last_dir}"
|
||||||
|
_techo "Hard linking from ${last_dir}"
|
||||||
|
fi
|
||||||
|
|
||||||
|
#
|
||||||
|
# Mark backup running and go back to original directory
|
||||||
|
#
|
||||||
|
touch "${destination_dir}${CMARKER}"
|
||||||
|
cd "${__abs_mydir}" || _exit_err "Cannot go back to ${__abs_mydir}."
|
||||||
|
|
||||||
|
#
|
||||||
|
# the rsync part
|
||||||
|
#
|
||||||
|
_techo "Transferring files..."
|
||||||
|
rsync "$@" "${source}" "${destination_dir}"; ret=$?
|
||||||
|
_techo "Finished backup (rsync return code: $ret)."
|
||||||
|
|
||||||
|
#
|
||||||
|
# export rsync return code, might be useful in post_exec
|
||||||
|
#
|
||||||
|
export rsync_return_code=$ret
|
||||||
|
|
||||||
|
#
|
||||||
|
# Set modification time (mtime) to current time, if sorting by mtime is enabled
|
||||||
|
#
|
||||||
|
[ -f "$c_mtime" ] && touch "${destination_dir}"
|
||||||
|
|
||||||
|
#
|
||||||
|
# Check if rsync exit code indicates failure.
|
||||||
|
#
|
||||||
|
fail=""
|
||||||
|
# variable is assigned using eval
|
||||||
|
# shellcheck disable=SC2154
|
||||||
|
if [ -f "$c_rsync_failure_codes" ]; then
|
||||||
|
while read -r code ; do
|
||||||
|
if [ "$ret" = "$code" ]; then
|
||||||
|
fail=1
|
||||||
|
fi
|
||||||
|
done <"${c_rsync_failure_codes}"
|
||||||
|
fi
|
||||||
|
|
||||||
|
#
|
||||||
|
# Remove marking here unless rsync failed.
|
||||||
|
#
|
||||||
|
if [ -z "$fail" ]; then
|
||||||
|
rm "${destination_dir}${CMARKER}" || \
|
||||||
|
_exit_err "Removing ${destination_dir}${CMARKER} failed."
|
||||||
|
if [ "${ret}" -ne 0 ]; then
|
||||||
|
_techo "Warning: rsync exited non-zero, the backup may be broken (see rsync errors)."
|
||||||
|
fi
|
||||||
|
else
|
||||||
|
_techo "Warning: rsync failed with return code $ret."
|
||||||
|
fi
|
||||||
|
|
||||||
|
#
|
||||||
|
# Create symlink to newest backup
|
||||||
|
#
|
||||||
|
# shellcheck disable=SC2010
|
||||||
|
latest_dir="$(ls -${TSORT}p1 "${ddir}" | grep '/$' | head -n 1)" || \
|
||||||
|
_exit_err "Failed to list content of ${ddir}."
|
||||||
|
|
||||||
|
ln -snf "${ddir}${latest_dir}" "${ddir}current" || \
|
||||||
|
_exit_err "Failed to create 'current' symlink."
|
||||||
|
|
||||||
|
#
|
||||||
|
# post_exec
|
||||||
|
#
|
||||||
|
if [ -x "${c_post_exec}" ]; then
|
||||||
|
_techo "Executing ${c_post_exec} ..."
|
||||||
|
"${c_post_exec}"; ret=$?
|
||||||
|
_techo "Finished ${c_post_exec}."
|
||||||
|
|
||||||
|
if [ "${ret}" -ne 0 ]; then
|
||||||
|
_exit_err "${c_post_exec} failed."
|
||||||
|
fi
|
||||||
|
fi
|
||||||
|
|
||||||
|
#
|
||||||
|
# Time calculation
|
||||||
|
#
|
||||||
|
end_s="$(${SDATE})"
|
||||||
|
full_seconds="$((end_s - begin_s))"
|
||||||
|
hours="$((full_seconds / 3600))"
|
||||||
|
minutes="$(((full_seconds % 3600) / 60))"
|
||||||
|
seconds="$((full_seconds % 60))"
|
||||||
|
|
||||||
|
_techo "Backup lasted: ${hours}:${minutes}:${seconds} (h:m:s)"
|
||||||
|
|
||||||
|
unlock "${name}"
|
||||||
|
|
||||||
|
# wait for children (doing delete_from_file) if any still running
|
||||||
|
wait
|
||||||
|
) || exit
|
||||||
|
done
|
||||||
|
|
||||||
|
#
|
||||||
|
# Be a good parent and wait for our children, if they are running wild parallel
|
||||||
|
# After all children are finished then remove control pipe.
|
||||||
|
#
|
||||||
|
if [ "${PARALLEL}" ]; then
|
||||||
|
_techo "Waiting for children to complete..."
|
||||||
|
wait
|
||||||
|
rm -f "${CONTROL_PIPE}"
|
||||||
|
fi
|
||||||
|
|
||||||
|
#
|
||||||
|
# Look for post-exec command (general)
|
||||||
|
#
|
||||||
|
if [ -x "${CPOSTEXEC}" ]; then
|
||||||
|
_techo "Executing ${CPOSTEXEC} ..."
|
||||||
|
"${CPOSTEXEC}"; ret=$?
|
||||||
|
_techo "Finished ${CPOSTEXEC} (return code: ${ret})."
|
||||||
|
|
||||||
|
if [ "${ret}" -ne 0 ]; then
|
||||||
|
_techo "${CPOSTEXEC} failed."
|
||||||
|
fi
|
||||||
|
fi
|
||||||
|
|
||||||
|
rm -f "${TMP}"
|
||||||
|
_techo "Finished"
|
||||||
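For reference while reading the script above, these are the typical invocations it supports (taken from its usage() text; the source name "mydata" is an example, /etc/ccollect is the built-in default for CCOLLECT_CONF):

  # back up one source using the interval definition named "daily"
  CCOLLECT_CONF=/etc/ccollect ./ccollect daily mydata

  # back up all configured sources, at most two at a time, logging to a file
  ./ccollect -a -j 2 -l /var/log/ccollect.log daily

  # print version information
  ./ccollect -V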
1
conf/defaults/intervals/normal
Normal file
1
conf/defaults/intervals/normal
Normal file
|
|
@ -0,0 +1 @@
|
||||||
|
25
|
||||||
|
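An interval file like the new defaults/intervals/normal above contains only the number of backups to keep for that interval, so defining another interval is a one-liner (the name and count here are examples):

  echo 14 > /etc/ccollect/defaults/intervals/weekly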
|
@ -1 +0,0 @@
|
||||||
/home/nico/backupdir
|
|
||||||
|
|
@ -1 +0,0 @@
|
||||||
/home/nico/vpn
|
|
||||||
1
conf/sources/delete_incomplete/source
Normal file
1
conf/sources/delete_incomplete/source
Normal file
|
|
@ -0,0 +1 @@
|
||||||
|
/home/users/nico/bin
|
||||||
1
conf/sources/from-remote/README
Normal file
1
conf/sources/from-remote/README
Normal file
|
|
@ -0,0 +1 @@
|
||||||
|
This is based on a production example I use for my notebook.
|
||||||
1
conf/sources/from-remote/exclude
Normal file
1
conf/sources/from-remote/exclude
Normal file
|
|
@ -0,0 +1 @@
|
||||||
|
/home/server/raid
|
||||||
1
conf/sources/from-remote/source
Normal file
1
conf/sources/from-remote/source
Normal file
|
|
@ -0,0 +1 @@
|
||||||
|
localhost:/home/users/nico/bin
|
||||||
0
conf/sources/fehler → conf/sources/from-remote/summary
Executable file → Normal file
0
conf/sources/fehler → conf/sources/from-remote/summary
Executable file → Normal file
1
conf/sources/local-with&ersand/destination
Normal file
1
conf/sources/local-with&ersand/destination
Normal file
|
|
@ -0,0 +1 @@
|
||||||
|
/tmp/ccollect
|
||||||
1
conf/sources/local-with&ersand/source
Normal file
1
conf/sources/local-with&ersand/source
Normal file
|
|
@ -0,0 +1 @@
|
||||||
|
/home/users/nico/bin
|
||||||
1
conf/sources/local/destination
Normal file
1
conf/sources/local/destination
Normal file
|
|
@ -0,0 +1 @@
|
||||||
|
/tmp/ccollect
|
||||||
1
conf/sources/local/exclude
Normal file
1
conf/sources/local/exclude
Normal file
|
|
@ -0,0 +1 @@
|
||||||
|
.git
|
||||||
0
conf/sources/local/no_verbose
Normal file
0
conf/sources/local/no_verbose
Normal file
1
conf/sources/local/source
Normal file
1
conf/sources/local/source
Normal file
|
|
@ -0,0 +1 @@
|
||||||
|
/home/users/nico/bin
|
||||||
|
|
@ -1 +0,0 @@
|
||||||
/home/user/nico/bin
|
|
||||||
|
|
@ -1 +0,0 @@
|
||||||
home.schottelius.org
|
|
||||||
|
|
@ -1 +0,0 @@
|
||||||
/home/user/nico/bin
|
|
||||||
1
conf/sources/source with spaces and interval/destination
Normal file
1
conf/sources/source with spaces and interval/destination
Normal file
|
|
@ -0,0 +1 @@
|
||||||
|
/tmp/ccollect
|
||||||
|
|
@ -0,0 +1 @@
|
||||||
|
30
|
||||||
1
conf/sources/source with spaces and interval/source
Normal file
1
conf/sources/source with spaces and interval/source
Normal file
|
|
@ -0,0 +1 @@
|
||||||
|
/home/users/nico/bin
|
||||||
0
conf/sources/source with spaces and interval/verbose
Normal file
0
conf/sources/source with spaces and interval/verbose
Normal file
|
|
@ -1 +0,0 @@
|
||||||
/home/user/nico/backupdir/testsource1
|
|
||||||
|
|
@ -1 +0,0 @@
|
||||||
/home/user/nico/oeffentlich/computer/projekte/ccollect
|
|
||||||
1
conf/sources/source-without-destination/exclude
Normal file
1
conf/sources/source-without-destination/exclude
Normal file
|
|
@ -0,0 +1 @@
|
||||||
|
.git
|
||||||
1
conf/sources/source-without-destination/source
Normal file
1
conf/sources/source-without-destination/source
Normal file
|
|
@ -0,0 +1 @@
|
||||||
|
/home/users/nico/bin
|
||||||
|
|
@ -1 +0,0 @@
|
||||||
manage
|
|
||||||
|
|
@ -1 +0,0 @@
|
||||||
nico@creme.schottelius.org:bin
|
|
||||||
|
|
@ -1 +0,0 @@
|
||||||
/home/nico/backupdir
|
|
||||||
|
|
@ -1,3 +0,0 @@
|
||||||
openvpn-2.0.1.tar.gz
|
|
||||||
nicht_reinnehmen
|
|
||||||
etwas mit leerzeichenli
|
|
||||||
|
|
@ -1 +0,0 @@
|
||||||
20
|
|
||||||
|
|
@ -1 +0,0 @@
|
||||||
/home/nico/vpn
|
|
||||||
0
conf/sources/this_is_not_a_source
Normal file
0
conf/sources/this_is_not_a_source
Normal file
1
conf/sources/very_verbose/README
Normal file
1
conf/sources/very_verbose/README
Normal file
|
|
@ -0,0 +1 @@
|
||||||
|
This is based on a production example I use for my notebook.
|
||||||
1
conf/sources/very_verbose/destination
Normal file
1
conf/sources/very_verbose/destination
Normal file
|
|
@ -0,0 +1 @@
|
||||||
|
/tmp/ccollect
|
||||||
1
conf/sources/very_verbose/exclude
Normal file
1
conf/sources/very_verbose/exclude
Normal file
|
|
@ -0,0 +1 @@
|
||||||
|
/home/server/raid
|
||||||
1
conf/sources/very_verbose/source
Normal file
1
conf/sources/very_verbose/source
Normal file
|
|
@ -0,0 +1 @@
|
||||||
|
/home/users/nico/bin
|
||||||
0
conf/sources/very_verbose/summary
Normal file
0
conf/sources/very_verbose/summary
Normal file
0
conf/sources/very_verbose/verbose
Normal file
0
conf/sources/very_verbose/verbose
Normal file
0
conf/sources/very_verbose/very_verbose
Normal file
0
conf/sources/very_verbose/very_verbose
Normal file
|
|
@ -1 +0,0 @@
|
||||||
/home/nico/backupdir/vpn
|
|
||||||
|
|
@ -1 +0,0 @@
|
||||||
/home/nico/vpn/
|
|
||||||
|
|
@ -1 +0,0 @@
|
||||||
/tmp
|
|
||||||
1
conf/sources/with_exec/destination
Normal file
1
conf/sources/with_exec/destination
Normal file
|
|
@ -0,0 +1 @@
|
||||||
|
/tmp/ccollect
|
||||||
|
|
@ -1 +1 @@
|
||||||
/bin
|
/home/users/nico/bin
|
||||||
|
|
|
||||||
3
contrib/README
Normal file
3
contrib/README
Normal file
|
|
@ -0,0 +1,3 @@
|
||||||
|
This directory contains patches or programs contributed by others
|
||||||
|
which are either not yet integrated into ccollect or may be kept
|
||||||
|
separate in general.
|
||||||
79
contrib/ccollect.spec
Normal file
79
contrib/ccollect.spec
Normal file
|
|
@ -0,0 +1,79 @@
|
||||||
|
Summary: (pseudo) incremental backup with different exclude lists using hardlinks and rsync
|
||||||
|
Name: ccollect
|
||||||
|
Version: 2.3
|
||||||
|
Release: 0
|
||||||
|
URL: http://www.nico.schottelius.org/software/ccollect
|
||||||
|
Source0: http://www.nico.schottelius.org/software/ccollect/%{name}-%{version}.tar.bz2
|
||||||
|
|
||||||
|
License: GPL-3
|
||||||
|
Group: Applications/System
|
||||||
|
Vendor: Nico Schottelius <nico-ccollect@schottelius.org>
|
||||||
|
BuildRoot: %{_tmppath}/%{name}-%(id -un)
|
||||||
|
BuildArch: noarch
|
||||||
|
Requires: rsync
|
||||||
|
|
||||||
|
%description
|
||||||
|
ccollect backs up data from local and remote hosts to your local hard disk.
|
||||||
|
Although ccollect creates full backups, it requires very little space on the backup medium, because it uses hardlinks to create an initial copy of the last backup.
|
||||||
|
Only the inodes used by the hardlinks and the changed files need additional space.
|
||||||
|
|
||||||
|
%prep
|
||||||
|
%setup -q
|
||||||
|
|
||||||
|
%install
|
||||||
|
rm -rf $RPM_BUILD_ROOT
|
||||||
|
|
||||||
|
#Installing main ccollect and /etc directory
|
||||||
|
%__install -d 755 %buildroot%_bindir
|
||||||
|
%__install -d 755 %buildroot%_sysconfdir/%name
|
||||||
|
%__install -m 755 ccollect %buildroot%_bindir/
|
||||||
|
|
||||||
|
#bin files from tools directory
|
||||||
|
for t in $(ls tools/ccollect_*) ; do
|
||||||
|
%__install -m 755 ${t} %buildroot%_bindir/
|
||||||
|
done
|
||||||
|
|
||||||
|
#Configuration examples and docs
|
||||||
|
%__install -d 755 %buildroot%_datadir/doc/%name-%version/examples
|
||||||
|
|
||||||
|
%__install -m 644 README %buildroot%_datadir/doc/%name-%version
|
||||||
|
%__install -m 644 COPYING %buildroot%_datadir/doc/%name-%version
|
||||||
|
%__install -m 644 CREDITS %buildroot%_datadir/doc/%name-%version
|
||||||
|
%__install -m 644 conf/README %buildroot%_datadir/doc/%name-%version/examples
|
||||||
|
%__cp -pr conf/defaults %buildroot%_datadir/doc/%name-%version/examples/
|
||||||
|
%__cp -pr conf/sources %buildroot%_datadir/doc/%name-%version/examples/
|
||||||
|
|
||||||
|
#Addition documentation and some config tools
|
||||||
|
%__install -d 755 %buildroot%_datadir/%name/tools
|
||||||
|
%__install -m 755 tools/called_from_remote_pre_exec %buildroot%_datadir/%name/tools
|
||||||
|
%__cp -pr tools/config-pre-* %buildroot%_datadir/%name/tools
|
||||||
|
%__install -m 755 tools/report_success %buildroot%_datadir/%name/tools
|
||||||
|
|
||||||
|
%clean
|
||||||
|
rm -rf $RPM_BUILD_ROOT
|
||||||
|
|
||||||
|
%files
|
||||||
|
%defattr(-,root,root)
|
||||||
|
%_bindir/ccollect*
|
||||||
|
%_datadir/doc/%name-%version
|
||||||
|
%_datadir/%name/tools
|
||||||
|
%docdir %_datadir/doc/%name-%version
|
||||||
|
%dir %_sysconfdir/%name
|
||||||
|
|
||||||
|
%changelog
|
||||||
|
* Thu Aug 20 2009 Nico Schottelius <nico-ccollect@schottelius.org> 0.8
|
||||||
|
- Introduce consistent time sorting (John Lawless)
|
||||||
|
- Check for source connectivity before trying backup (John Lawless)
|
||||||
|
- Defensive programming patch (John Lawless)
|
||||||
|
- Some code cleanups (argument parsing, usage) (Nico Schottelius)
|
||||||
|
- Only consider directories as sources when using -a (Nico Schottelius)
|
||||||
|
- Fix general parsing problem with -a (Nico Schottelius)
|
||||||
|
- Fix potential bug when using remote_host, delete_incomplete and ssh (Nico Schottelius)
|
||||||
|
- Improve removal performance: minimised number of 'rm' calls (Nico Schottelius)
|
||||||
|
- Support sorting by mtime (John Lawless)
|
||||||
|
- Improve option handling (John Lawless)
|
||||||
|
- Add support for quiet operation for dead devices (quiet_if_down) (John Lawless)
|
||||||
|
- Add smart option parsing, including support for default values (John Lawless)
|
||||||
|
- Updated and cleaned up documentation (Nico Schottelius)
|
||||||
|
- Fixed bug "removal of current directory" in ccollect_delete_source.sh (Found by G????nter St????hr, fixed by Nico Schottelius)
|
||||||
|
|
||||||
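The spec file above is consumed by rpmbuild in the usual way; a sketch under the assumption of a standard ~/rpmbuild tree and a release tarball matching Source0:

  cp ccollect-2.3.tar.bz2 ~/rpmbuild/SOURCES/
  rpmbuild -ba contrib/ccollect.spec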
47
contrib/ccollect_mgr/README
Normal file
47
contrib/ccollect_mgr/README
Normal file
|
|
@ -0,0 +1,47 @@
|
||||||
|
[Almost complete Copy of an e-mail from Patrick Drolet]
|
||||||
|
|
||||||
|
Hello again,
|
||||||
|
|
||||||
|
|
||||||
|
|
||||||
|
I have created a script to better manage the backups since my
|
||||||
|
upload/download ratio and my bandwidth is limited by my ISP, and my hard
|
||||||
|
disk space is also somewhat limited. The script is called
|
||||||
|
"ccollect_mgr.sh".
|
||||||
|
|
||||||
|
|
||||||
|
|
||||||
|
Provides the following features
|
||||||
|
|
||||||
|
1) Determine the interval (daily/weekly/monthly)
|
||||||
|
|
||||||
|
a. Define when you want weekly and monthly backups. It takes care of
|
||||||
|
the rest
|
||||||
|
|
||||||
|
2) Perform the backups using ccollect
|
||||||
|
|
||||||
|
3) Copy the ccollect log output to the first backup of the set
|
||||||
|
|
||||||
|
a. Keeping the detailed log of each backup is always handy!
|
||||||
|
|
||||||
|
4) Build a periodic report and include the real amount of disk used
|
||||||
|
|
||||||
|
a. Computes the real amount of disk used (eg: no double counting of
|
||||||
|
hard links)
|
||||||
|
|
||||||
|
b. Shows the actual amount of data transferred
|
||||||
|
|
||||||
|
5) Send an email if there have been errors or warnings
|
||||||
|
|
||||||
|
6) Send a periodic email to show transfer size, real backup size, etc
|
||||||
|
|
||||||
|
a. Weekly reports are nice!
|
||||||
|
|
||||||
|
[...]
|
||||||
|
|
||||||
|
- rdu (real du), which computes the real amount of disk used (no
|
||||||
|
double/triple counting hard links), same code as in ccollect_mgr.sh.
|
||||||
|
|
||||||
|
- S60ccollect_example, an example script to put in etc/init.d to
|
||||||
|
add ccollect_mgr to the crontab
|
||||||
|
|
||||||
21
contrib/ccollect_mgr/S60ccollect_example
Normal file
|
|
@ -0,0 +1,21 @@
|
||||||
|
#!/bin/sh
|
||||||
|
|
||||||
|
# Standard Linux: put in /etc/init.d
|
||||||
|
# Busybox: put in /opt/etc/init.d
|
||||||
|
|
||||||
|
# Add ccollect_mgr job to crontab
|
||||||
|
# Syntax reminder from crontab:
|
||||||
|
# minute 0-59
|
||||||
|
# hour 0-23
|
||||||
|
# day of month 1-31
|
||||||
|
# month 1-12 (or names, see below)
|
||||||
|
# day of week 0-7 (0 or 7 is Sun, or use names)
|
||||||
|
|
||||||
|
crontab -l | grep -v ccollect_mgr > /tmp/crontab.tmp
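# (assumed intent: dropping any existing ccollect_mgr line first keeps repeated
#  runs of this script from adding duplicate crontab entries)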
|
||||||
|
|
||||||
|
# Backup every day at 1 am.
|
||||||
|
echo "00 01 * * * /usr/local/sbin/ccollect_mgr.sh -from nas@myemail.net -to me@myemail.net -server relay_or_smtp_server NAS > /usr/local/var/log/ccollect.cron &" >> /tmp/crontab.tmp
|
||||||
|
|
||||||
|
crontab /tmp/crontab.tmp
|
||||||
|
rm /tmp/crontab.tmp
|
||||||
|
|
||||||
542
contrib/ccollect_mgr/ccollect_mgr.sh
Normal file
|
|
@ -0,0 +1,542 @@
|
||||||
|
#!/bin/sh
|
||||||
|
#
|
||||||
|
# ----------------------------------------------------------------------------
|
||||||
|
# Last update: 2009-12-11
|
||||||
|
# By : pdrolet (ccollect_mgr@drolet.name)
|
||||||
|
# ----------------------------------------------------------------------------
|
||||||
|
# Job manager to the ccollect utilities
|
||||||
|
# (ccollect written by Nico Schottelius)
|
||||||
|
#
|
||||||
|
# Provides the following features
|
||||||
|
# 1) Determine the interval (daily/weekly/monthly)
|
||||||
|
# 2) Check the estimated file transfer size
|
||||||
|
# 3) Perform the backups using ccollect
|
||||||
|
# 4) Copy the ccollect log to the first backup of the set
|
||||||
|
# 5) Build a periodic report and include the real amount of disk used
|
||||||
|
# 6) Send an email if there have been errors or warnings
|
||||||
|
# 7) Send a periodic email to show transfer size, real backup size, etc
|
||||||
|
# ----------------------------------------------------------------------------
|
||||||
|
#
|
||||||
|
# This script was written primarily to gain better visibility of backups in
|
||||||
|
# an environment where data transfer is limited and so is bandwidth
|
||||||
|
# (eg: going through an ISP). The primary targets of this script were a
|
||||||
|
# DNS323 and a QNAP T209 (eg: Busybox devices and not standard Linux devices)
|
||||||
|
# but it should run on any POSIX compliant device.
|
||||||
|
#
|
||||||
|
# Note: This is one of my first scripts in over a decade... don't use this as a
|
||||||
|
# reference (but take a look at ccollect.sh... very well written!)
|
||||||
|
# ----------------------------------------------------------------------------
|
||||||
|
#
|
||||||
|
# -------------------------------------------
|
||||||
|
# TO MAKE THIS SCRIPT RUN ON A BUSYBOX DEVICE
|
||||||
|
# -------------------------------------------
|
||||||
|
# - You may need to install Optware and the following packages:
|
||||||
|
# - findutils (to get a find utility which supports printf)
|
||||||
|
# - procps (to get a ps utility that is standard)
|
||||||
|
# - mini-sendmail (this is what I used to send emails... you could easily
|
||||||
|
# modify this to use sendmail, mutt, putmail, etc...).
|
||||||
|
# - On DNS323 only: Your Busybox is very limited. For details, see
|
||||||
|
# http://wiki.dns323.info/howto:ffp#shells. You need to redirect /bin/sh
|
||||||
|
# to the Busybox provided with ffp (Fun Plug). To do this, type:
|
||||||
|
# ln -fs /ffp/bin/sh /bin/sh
|
||||||
|
#
|
||||||
|
# --------------------------------------------------
|
||||||
|
# TO MAKE THIS SCRIPT RUN ON A STANDARD LINUX DEVICE
|
||||||
|
# --------------------------------------------------
|
||||||
|
# - You will need to install mini_sendmail or rewrite the send_email routine.
|
||||||
|
#
|
||||||
|
# ----------------------------------------------------------------------------
|
||||||
|
|
||||||
|
# Send a warning if the worst-case data transfer will be larger than these limits (in MB)...
|
||||||
|
warning_transfer_size=1024
|
||||||
|
abort_transfer_size=5120
|
||||||
|
|
||||||
|
# Define paths and default file names
|
||||||
|
ADD_TO_PATH="/opt/bin:/opt/sbin:/usr/local/bin:/usr/local/sbin"
|
||||||
|
CCOLLECT="ccollect.sh"
|
||||||
|
CCOLLECT_CONF="/usr/local/etc/ccollect"
|
||||||
|
|
||||||
|
PS="/opt/bin/ps"
|
||||||
|
FIND="/opt/bin/find"
|
||||||
|
|
||||||
|
TEMP_LOG="${CCOLLECT_CONF}"/log.$$
|
||||||
|
per_report="${CCOLLECT_CONF}/periodic_report.log"
|
||||||
|
tmp_report="/tmp/ccollect.$$"
|
||||||
|
tmp_mgr="/tmp/ccollect_mgr.$$"
|
||||||
|
tmp_email="/tmp/email.$$"
|
||||||
|
|
||||||
|
backups_not_found=""
|
||||||
|
|
||||||
|
# Sub routines...
|
||||||
|
|
||||||
|
send_email()
|
||||||
|
{
|
||||||
|
# Send a simple email using mini-sendmail.
|
||||||
|
|
||||||
|
msg_body_file="$1"
|
||||||
|
shift
|
||||||
|
|
||||||
|
# ------------------------------
|
||||||
|
# Quit if we can't send an email
|
||||||
|
# ------------------------------
|
||||||
|
if [ "${to}" == "" ] || [ "${mail_server}" == "" ]; then
|
||||||
|
echo "Missing mail server or destination email. No email sent with subject: $@"
|
||||||
|
exit 1
|
||||||
|
fi
|
||||||
|
|
||||||
|
echo from: "${from}" > "${tmp_email}"
|
||||||
|
echo subject: "$@" >> "${tmp_email}"
|
||||||
|
echo to: "${to}" >> "${tmp_email}"
|
||||||
|
echo cc: >> "${tmp_email}"
|
||||||
|
echo bcc: >> "${tmp_email}"
|
||||||
|
echo "" >> "${tmp_email}"
|
||||||
|
echo "" >> "${tmp_email}"
|
||||||
|
cat "${msg_body_file}" >> "${tmp_email}"
|
||||||
|
echo "" >> "${tmp_email}"
|
||||||
|
|
||||||
|
echo ""
|
||||||
|
echo Sending email to ${to} to report the following:
|
||||||
|
echo -----------------------------------------------
|
||||||
|
cat "${tmp_email}"
|
||||||
|
cat "${tmp_email}" | mini_sendmail -f"${from}" -s"${mail_server}" "${to}"
|
||||||
|
rm "${tmp_email}"
|
||||||
|
}
|
||||||
|
|
||||||
|
remove_source()
|
||||||
|
{
|
||||||
|
remove_no=$1
|
||||||
|
eval echo Removing backup \"\$source_$1\"
|
||||||
|
|
||||||
|
no_sources="$(( ${no_sources} - 1 ))"
|
||||||
|
while [ "${remove_no}" -lt "${no_sources}" ]; do
|
||||||
|
eval source_${remove_no}=\"\$source_$(( ${remove_no} + 1))\"
|
||||||
|
eval ddir_${remove_no}=\"\$ddir_$(( ${remove_no} + 1))\"
|
||||||
|
remove_no=$(( ${remove_no} + 1 ))
|
||||||
|
done
|
||||||
|
}
|
||||||
|
|
||||||
|
compute_rdu()
|
||||||
|
{
|
||||||
|
kdivider=1
|
||||||
|
find_options=""
|
||||||
|
|
||||||
|
while [ "$#" -ge 1 ]; do
|
||||||
|
case "$1" in
|
||||||
|
-m)
|
||||||
|
kdivider=1024
|
||||||
|
;;
|
||||||
|
-g)
|
||||||
|
kdivider=1048576
|
||||||
|
;;
|
||||||
|
*)
|
||||||
|
break
|
||||||
|
;;
|
||||||
|
esac
|
||||||
|
shift
|
||||||
|
done
|
||||||
|
|
||||||
|
if [ "$#" == 0 ]; then
|
||||||
|
rdu=0
|
||||||
|
return 1
|
||||||
|
fi
|
||||||
|
|
||||||
|
# ------------------------------------------------------------------------------------------------------
|
||||||
|
# Compute the real disk usage (eg: hard links to files outside the backup set don't count)
|
||||||
|
# ------------------------------------------------------------------------------------------------------
|
||||||
|
# 1) Find selected files and list link count, inodes, file type and size
|
||||||
|
# 2) Sort (sorts on inodes since link count is constant per inode)
|
||||||
|
# 3) Merge duplicates using uniq
|
||||||
|
# (result is occurrence count, link count, inode, file type and size)
|
||||||
|
# 4) Use awk to sum up the file size of each inode when the occurrence count
|
||||||
|
# and link count are the same. Use %k for size since awk's printf is 32 bits
|
||||||
|
# 5) Present the result with additional dividers based on command line parameters
|
||||||
|
#
|
||||||
|
|
||||||
|
rdu=$(( ( `"${FIND}" "$@" -printf '%n %i %y %k \n' \
|
||||||
|
| sort -n \
|
||||||
|
| uniq -c \
|
||||||
|
| awk '{ if (( $1 == $2 ) || ($4 == "d")) { sum += $5; } } END { printf "%u\n",(sum); }'` \
|
||||||
|
+ ${kdivider} - 1 ) / ${kdivider} ))
|
||||||
|
}
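# Worked example of the pipeline above (illustrative; inode numbers are made up):
# a 4 KB file hard-linked into two of the selected snapshots and nowhere else
# shows up twice as "2 314159 f 4", which uniq -c turns into "2 2 314159 f 4";
# occurrence count (2) equals link count (2), so awk counts its 4 KB once.
# The same file with a third link outside the selected tree becomes
# "2 3 271828 f 4"; occurrence (2) < link count (3), so it is skipped.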
|
||||||
|
|
||||||
|
check_running_backups()
|
||||||
|
{
|
||||||
|
# Check if a backup is already ongoing. If so, skip and send email
|
||||||
|
# Don't rely on the ccollect marker, since it does not indicate whether a backup is still running
|
||||||
|
|
||||||
|
source_no=0
|
||||||
|
while [ "${source_no}" -lt "${no_sources}" ]; do
|
||||||
|
eval backup=\"\$source_${source_no}\"
|
||||||
|
|
||||||
|
PID=$$
|
||||||
|
"${PS}" -e -o pid,ppid,args 2> /dev/null \
|
||||||
|
| grep -v -e grep -e "${PID}.*ccollect.*${backup}" \
|
||||||
|
| grep "ccollect.*${backup}" > "${tmp_mgr}" 2> /dev/null
|
||||||
|
running_proc=`cat "${tmp_mgr}" | wc -l`
|
||||||
|
|
||||||
|
if [ ${running_proc} -gt 0 ]; then
|
||||||
|
# Remove backup from list
|
||||||
|
running_backups="${running_backups}${backup} "
|
||||||
|
|
||||||
|
echo "Process already running:"
|
||||||
|
cat "${tmp_mgr}"
|
||||||
|
|
||||||
|
remove_source ${source_no}
|
||||||
|
else
|
||||||
|
source_no=$(( ${source_no} + 1 ))
|
||||||
|
fi
|
||||||
|
rm "${tmp_mgr}"
|
||||||
|
done
|
||||||
|
|
||||||
|
if [ "${running_backups}" != "" ]; then
|
||||||
|
echo "skipping ccollect backups already running: ${running_backups}" | tee "${tmp_report}"
|
||||||
|
send_email "${tmp_report}" "WARNING - skipping ccollect backups already running: ${running_backups}"
|
||||||
|
rm "${tmp_report}"
|
||||||
|
fi
|
||||||
|
}
|
||||||
|
|
||||||
|
find_interval()
|
||||||
|
{
|
||||||
|
# ----------------------------------------------------
|
||||||
|
# Find interval for ccollect backup.
|
||||||
|
# optional parameters:
|
||||||
|
# - Day of the week to do weekly backups
|
||||||
|
# - Do monthly instead of weekly on the Nth week
|
||||||
|
# ----------------------------------------------------
|
||||||
|
|
||||||
|
weekly_backup="$1"
|
||||||
|
monthly_backup="$2"
|
||||||
|
|
||||||
|
weekday=`date "+%w"`
|
||||||
|
if [ ${weekday} -eq ${weekly_backup} ]; then
|
||||||
|
dom=`date "+%e"`
|
||||||
|
weeknum=$(( ( ${dom} / 7 ) + 1 ))
|
||||||
|
if [ "${weeknum}" -eq "${monthly_backup}" ]; then
|
||||||
|
interval=monthly
|
||||||
|
else
|
||||||
|
interval=weekly
|
||||||
|
fi
|
||||||
|
else
|
||||||
|
interval=daily
|
||||||
|
fi
|
||||||
|
}
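# Example of the selection above (values illustrative, using the defaults
# weekly_backup=1 and monthly_backup=4): a run on Monday the 22nd gives dom=22,
# weeknum=(22/7)+1=4, so interval=monthly; Monday the 8th gives weeknum=2, so
# interval=weekly; any run on another weekday stays at interval=daily.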
|
||||||
|
|
||||||
|
precheck_transfer_size()
|
||||||
|
{
|
||||||
|
# Check the estimated (worst case) transfer size and send an email if it is larger than a certain size
|
||||||
|
# Abort the backup if the total transfer is larger than the maximum limit (ex: an error somewhere
|
||||||
|
# requires a full backup instead of an incremental one, which could blow the quota with the ISP)
|
||||||
|
#
|
||||||
|
# Be nice and add error checking one day...
|
||||||
|
|
||||||
|
source_no=0
|
||||||
|
while [ "${source_no}" -lt "${no_sources}" ]; do
|
||||||
|
eval backup=\"\$source_${source_no}\"
|
||||||
|
eval ddir=\"\$ddir_${source_no}\"
|
||||||
|
|
||||||
|
last_dir="$(ls -tcp1 "${ddir}" | grep '/$' | head -n 1)"
|
||||||
|
sdir="$(cat "${CCOLLECT_CONF}"/sources/"${backup}"/source)"; ret="$?"
|
||||||
|
if [ -f "${CCOLLECT_CONF}"/sources/"${backup}"/exclude ]; then
|
||||||
|
exclude="--exclude-from=${CCOLLECT_CONF}/sources/${backup}/exclude";
|
||||||
|
else
|
||||||
|
exclude=""
|
||||||
|
fi
|
||||||
|
rsync_options=""
|
||||||
|
if [ -f "${CCOLLECT_CONF}"/sources/"${backup}"/rsync_options ]; then
|
||||||
|
while read line; do
|
||||||
|
rsync_options="${rsync_options} ${line}"
|
||||||
|
done < ${CCOLLECT_CONF}/sources/${backup}/rsync_options
|
||||||
|
fi
|
||||||
|
|
||||||
|
rsync -n -a --delete --stats ${rsync_options} "${exclude}" "${sdir}" "${ddir}/${last_dir}" > "${tmp_report}"
|
||||||
|
|
||||||
|
tx_rx=`cat "${tmp_report}" | grep "Total transferred file size" | \
|
||||||
|
awk '{ { tx += $5 } } END { printf "%u",(((tx)+1024*1024-1)/1024/1024); }'`
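# (the dry-run --stats output contains a line like
#  "Total transferred file size: 12345 bytes"; field 5 is the byte count,
#  which the awk above rounds up to whole MB; the sample line is illustrative)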
|
||||||
|
total_xfer=$(( ${total_xfer} + ${tx_rx} ))
|
||||||
|
|
||||||
|
source_no=$(( ${source_no} + 1 ))
|
||||||
|
done
|
||||||
|
|
||||||
|
echo "Transfer estimation for${ccollect_backups}: ${total_xfer} MB"
|
||||||
|
|
||||||
|
if [ ${total_xfer} -gt ${abort_transfer_size} ]; then
|
||||||
|
# --------------------------------------------------
|
||||||
|
# Send an error if transfer is larger than max limit
|
||||||
|
# --------------------------------------------------
|
||||||
|
# Useful to detect potential issues when there is a transfer quota (ex: with an ISP)
|
||||||
|
|
||||||
|
echo "Data transfer larger than ${abort_transfer_size} MB is expected for${ccollect_backups}" >> "${tmp_report}"
|
||||||
|
echo "** BACKUP ABORTED **" >> "${tmp_report}"
|
||||||
|
|
||||||
|
send_email "${tmp_report}" "ERROR: aborted ccollect for${ccollect_backups} -- Estimated Tx+Rx: ${total_xfer} MB"
|
||||||
|
rm "${tmp_report}"
|
||||||
|
exit 1
|
||||||
|
elif [ ${total_xfer} -gt ${warning_transfer_size} ]; then
|
||||||
|
# --------------------------------------------------
|
||||||
|
# Send a warning if transfer is expected to be large
|
||||||
|
# --------------------------------------------------
|
||||||
|
# Useful to detect potential issues when there is a transfer quota (ex: with an ISP)
|
||||||
|
|
||||||
|
echo "Data transfer larger than ${warning_transfer_size} MB is expected for${ccollect_backups}" > "${tmp_report}"
|
||||||
|
|
||||||
|
send_email "${tmp_report}" "WARNING ccollect for${ccollect_backups} -- Estimated Tx+Rx: ${total_xfer} MB"
|
||||||
|
rm "${tmp_report}"
|
||||||
|
fi
|
||||||
|
}
|
||||||
|
|
||||||
|
build_backup_dir_list()
|
||||||
|
{
|
||||||
|
source_no=0
|
||||||
|
while [ "${source_no}" -lt "${no_sources}" ]; do
|
||||||
|
eval backup=\"\$source_${source_no}\"
|
||||||
|
eval ddir=\"\$ddir_${source_no}\"
|
||||||
|
|
||||||
|
backup_dir="`cat "${TEMP_LOG}" \
|
||||||
|
| grep "\[${backup}\] .*: Creating.* ${ddir}" \
|
||||||
|
| head -n 1 \
|
||||||
|
| sed 's/[^\/]*\//\//; s/ \.\.\.//'`"
|
||||||
|
|
||||||
|
if [ ! -d "${backup_dir}" ]; then
|
||||||
|
backups_not_found="${backups_not_found}\"${backup}\" "
|
||||||
|
echo -n "Backup directory for \"${backup}\" not found. "
|
||||||
|
remove_source "${source_no}"
|
||||||
|
else
|
||||||
|
eval export backup_dir_list_${source_no}="${backup_dir}"
|
||||||
|
# eval echo Backup Dir List: \"\$backup_dir_list_${source_no}\"
|
||||||
|
source_no=$(( ${source_no} + 1 ))
|
||||||
|
fi
|
||||||
|
done
|
||||||
|
}
|
||||||
|
|
||||||
|
move_log()
|
||||||
|
{
|
||||||
|
if [ "${no_sources}" -gt 0 ]; then
|
||||||
|
eval log_file=\"\$backup_dir_list_0\"/ccollect.log
|
||||||
|
mv "${TEMP_LOG}" "${log_file}"
|
||||||
|
echo New Log Location: "${log_file}"
|
||||||
|
else
|
||||||
|
echo "WARNING: none of the backup set have been created"
|
||||||
|
log_file="${TEMP_LOG}"
|
||||||
|
fi
|
||||||
|
}
|
||||||
|
|
||||||
|
send_report()
|
||||||
|
{
|
||||||
|
# Analyze log for periodic report and for error status report
|
||||||
|
cat "${log_file}" | ccollect_analyse_logs.sh iwe > "${tmp_report}"
|
||||||
|
|
||||||
|
# -------------------------
|
||||||
|
# Build the periodic report
|
||||||
|
# -------------------------
|
||||||
|
# Compute the total number of MB sent and received for all the backup sets
|
||||||
|
tx_rx=`cat "${tmp_report}" | \
|
||||||
|
grep 'sent [[:digit:]]* bytes received [0-9]* bytes' | \
|
||||||
|
awk '{ { tx += $3 } { rx += $6} } END \
|
||||||
|
{ printf "%u",(((tx+rx)+(1024*1024)-1)/1024/1024); }'`
|
||||||
|
current_date=`date +'20%y/%m/%d %Hh%M -- '`
|
||||||
|
|
||||||
|
# ------------------------------------------
|
||||||
|
# Get the real disk usage for the backup set
|
||||||
|
# ------------------------------------------
|
||||||
|
total_rdu=0
|
||||||
|
source_no=0
|
||||||
|
while [ "${source_no}" -lt "${no_sources}" ]; do
|
||||||
|
eval backup_dir=\"\$backup_dir_list_${source_no}\"
|
||||||
|
compute_rdu -m "${backup_dir}"
|
||||||
|
total_rdu=$(( ${total_rdu} + ${rdu} ))
|
||||||
|
source_no=$(( ${source_no} + 1 ))
|
||||||
|
done
|
||||||
|
|
||||||
|
# ---------------------------------------------------------
|
||||||
|
# Get the disk usage for all backups of each backup sets...
|
||||||
|
# ** BE PATIENT!!! **
|
||||||
|
# ---------------------------------------------------------
|
||||||
|
historical_rdu=0
|
||||||
|
source_no=0
|
||||||
|
while [ "${source_no}" -lt "${no_sources}" ]; do
|
||||||
|
eval backup_dir=\"\$ddir_${source_no}\"
|
||||||
|
compute_rdu -m "${backup_dir}"
|
||||||
|
historical_rdu=$(( ${historical_rdu} + ${rdu} ))
|
||||||
|
source_no=$(( ${source_no} + 1 ))
|
||||||
|
done
|
||||||
|
|
||||||
|
historical_rdu=$(( (${historical_rdu}+1023) / 1024 ))
|
||||||
|
|
||||||
|
if [ "${no_sources}" -gt 0 ]; then
|
||||||
|
ccollect_backups=""
|
||||||
|
else
|
||||||
|
ccollect_backups="(none performed) "
|
||||||
|
fi
|
||||||
|
|
||||||
|
source_no=0
|
||||||
|
while [ "${source_no}" -lt "${no_sources}" ]; do
|
||||||
|
eval backup=\"\$source_${source_no}\"
|
||||||
|
ccollect_backups="${ccollect_backups}\"${backup}\" "
|
||||||
|
source_no=$(( ${source_no} + 1 ))
|
||||||
|
done
|
||||||
|
|
||||||
|
echo ${current_date} Tx+Rx: ${tx_rx} MB -- \
|
||||||
|
Disk Usage: ${total_rdu} MB -- \
|
||||||
|
Backup set \(${interval}\):${ccollect_backups} -- \
|
||||||
|
Historical backups usage: ${historical_rdu} GB >> "${per_report}"
|
||||||
|
echo "Total Data Transfer: ${tx_rx} MB -- Total Disk Usage: ${total_rdu} MB -- Total Historical backups usage: ${historical_rdu} GB"
|
||||||
|
|
||||||
|
# ----------------------------------------
|
||||||
|
# Send a status email if there is an error
|
||||||
|
# ----------------------------------------
|
||||||
|
ccollect_we=`cat "${log_file}" | ccollect_analyse_logs.sh we | wc -l`
|
||||||
|
if [ ${ccollect_we} -ge 1 ]; then
|
||||||
|
send_email "${tmp_report}" "ERROR ccollect for${ccollect_backups} -- Tx+Rx: ${tx_rx} MB"
|
||||||
|
fi
|
||||||
|
|
||||||
|
# --------------------
|
||||||
|
# Send periodic report
|
||||||
|
# --------------------
|
||||||
|
if [ "${report_interval}" = "${interval}" ] || [ "${interval}" = "monthly" ]; then
|
||||||
|
|
||||||
|
# Make reporting atomic to handle concurrent ccollect_mgr instances
|
||||||
|
mv "${per_report}" "${per_report}".$$
|
||||||
|
cat "${per_report}".$$ >> "${per_report}".history
|
||||||
|
|
||||||
|
# Calculate total amount of bytes sent and received
|
||||||
|
tx_rx=`cat "${per_report}".$$ | \
|
||||||
|
awk '{ { transfer += $5 } } END \
|
||||||
|
{ printf "%u",(transfer); }'`
|
||||||
|
|
||||||
|
# Send email
|
||||||
|
send_email "${per_report}.$$" "${report_interval} ccollect status for${ccollect_backups} -- Tx+Rx: ${tx_rx} MB"
|
||||||
|
rm "${per_report}.$$"
|
||||||
|
fi
|
||||||
|
|
||||||
|
rm "${tmp_report}"
|
||||||
|
}
|
||||||
|
|
||||||
|
# ------------------------------------------------
|
||||||
|
# Add to PATH in case we're launching from crontab
|
||||||
|
# ------------------------------------------------
|
||||||
|
|
||||||
|
PATH="${ADD_TO_PATH}:${PATH}"
|
||||||
|
|
||||||
|
# --------------
|
||||||
|
# Default Values
|
||||||
|
# --------------
|
||||||
|
|
||||||
|
# Set on which interval status emails are sent (daily, weekly, monthly)
|
||||||
|
report_interval=weekly
|
||||||
|
|
||||||
|
# Set day of the week for weekly backups. Default is Monday
|
||||||
|
# 0=Sun, 1=Mon, 2=Tue, 3=Wed, 4=Thu, 5=Fri, 6=Sat
|
||||||
|
weekly_backup=1
|
||||||
|
|
||||||
|
# Set the monthly backup interval. Default is 4th Monday of every month
|
||||||
|
monthly_backup=4
|
||||||
|
|
||||||
|
# ---------------------------------
|
||||||
|
# Parse command line
|
||||||
|
# ---------------------------------
|
||||||
|
|
||||||
|
show_help=0
|
||||||
|
export no_sources=0
|
||||||
|
|
||||||
|
while [ "$#" -ge 1 ]; do
|
||||||
|
case "$1" in
|
||||||
|
-help)
|
||||||
|
show_help=1
|
||||||
|
;;
|
||||||
|
-from)
|
||||||
|
from="$2"
|
||||||
|
shift
|
||||||
|
;;
|
||||||
|
-to)
|
||||||
|
to="$2"
|
||||||
|
shift
|
||||||
|
;;
|
||||||
|
-server|mail_server)
|
||||||
|
mail_server="$2"
|
||||||
|
shift
|
||||||
|
;;
|
||||||
|
-weekly)
|
||||||
|
weekly_backup="$2"
|
||||||
|
shift
|
||||||
|
;;
|
||||||
|
-monthly)
|
||||||
|
monthly_backup="$2"
|
||||||
|
shift
|
||||||
|
;;
|
||||||
|
-warning_size)
|
||||||
|
warning_transfer_size="$2"
|
||||||
|
shift
|
||||||
|
;;
|
||||||
|
-abort_size)
|
||||||
|
abort_transfer_size="$2"
|
||||||
|
shift
|
||||||
|
;;
|
||||||
|
-report)
|
||||||
|
report_interval="$2"
|
||||||
|
shift
|
||||||
|
;;
|
||||||
|
-*)
|
||||||
|
ccollect_options="${ccollect_options}$1 "
|
||||||
|
;;
|
||||||
|
daily|weekly|monthly)
|
||||||
|
;;
|
||||||
|
*)
|
||||||
|
eval backup=\"\$1\"
|
||||||
|
ddir="$(cat "${CCOLLECT_CONF}"/sources/"${backup}"/destination)"; ret="$?"
|
||||||
|
if [ "${ret}" -ne 0 ]; then
|
||||||
|
echo "Destination ${CCOLLECT_CONF}/sources/${backup}/destination is not readable... Skipping."
|
||||||
|
else
|
||||||
|
ccollect_backups="${ccollect_backups} \"$1\""
|
||||||
|
eval export source_${no_sources}=\"\$1\"
|
||||||
|
eval export ddir_${no_sources}="${ddir}"
|
||||||
|
# eval echo Adding source \"\$source_${no_sources}\" -- \"\$ddir_${no_sources}\"
|
||||||
|
no_sources="$(( ${no_sources} + 1 ))"
|
||||||
|
fi
|
||||||
|
;;
|
||||||
|
esac
|
||||||
|
shift
|
||||||
|
done
|
||||||
|
|
||||||
|
if [ "${no_sources}" -lt 1 ] || [ ${show_help} -eq 1 ]; then
|
||||||
|
echo ""
|
||||||
|
echo "$0: Syntax"
|
||||||
|
echo " -help This help"
|
||||||
|
echo " -from <email> From email address (ex.: -from nas@home.com)"
|
||||||
|
echo " -to <email> Send email to this address (ex.: -to me@home.com)"
|
||||||
|
echo " -server <smtp_addr> SMTP server used for sending emails"
|
||||||
|
echo " -weekly <day#> Define wich day of the week is the weekly backup"
|
||||||
|
echo " Default is ${weekly_backup}. Sunday = 0, Saturday = 6"
|
||||||
|
echo " -monthly <week#> Define on which week # is the monthly backup"
|
||||||
|
echo " Default is ${monthly_backup}. Value = 1 to 5"
|
||||||
|
echo " -report <interval> Frequency of report email (daily, weekly or monthly)"
|
||||||
|
echo " Default is ${report_interval}"
|
||||||
|
echo " -warning_size <MB> Send a warning email if the transfer size exceed this"
|
||||||
|
echo " Default is ${warning_transfer_size} MB"
|
||||||
|
echo " -abort_size <MB> Abort and send an error email if the transfer size exceed this"
|
||||||
|
echo " Default is ${abort_transfer_size} MB"
|
||||||
|
echo ""
|
||||||
|
echo " other parameters are transfered to ccollect"
|
||||||
|
echo ""
|
||||||
|
exit 0
|
||||||
|
fi
|
||||||
|
|
||||||
|
#echo Backup sets:"${ccollect_backups}"
|
||||||
|
check_running_backups
|
||||||
|
|
||||||
|
if [ "${no_sources}" -lt 1 ]; then
|
||||||
|
echo "No backup sets are reachable"
|
||||||
|
exit 1
|
||||||
|
fi
|
||||||
|
|
||||||
|
find_interval ${weekly_backup} ${monthly_backup}
|
||||||
|
echo Interval: ${interval}
|
||||||
|
|
||||||
|
precheck_transfer_size
|
||||||
|
|
||||||
|
"${CCOLLECT}" ${ccollect_options} ${interval} ${ccollect_backups} | tee "${TEMP_LOG}"
|
||||||
|
|
||||||
|
build_backup_dir_list
|
||||||
|
move_log
|
||||||
|
|
||||||
|
send_report
|
||||||
|
|
||||||
65
contrib/ccollect_mgr/rdu
Normal file
|
|
@ -0,0 +1,65 @@
|
||||||
|
#!/bin/sh
|
||||||
|
#
|
||||||
|
# -------------------------------------------------------------
|
||||||
|
# Get the real disk usage for a group of selected files
|
||||||
|
#
|
||||||
|
# This script counts the size of the files and directories
|
||||||
|
# listed, but excludes files that have hard links referenced outside
|
||||||
|
# the list.
|
||||||
|
#
|
||||||
|
# The underlying objective of this script is to report the
|
||||||
|
# real amount of disk used for backup solutions that are heavily
|
||||||
|
# using hard links to save disk space on identical files (I use
|
||||||
|
# ccollect, but this likely works with rsnapshot)
|
||||||
|
# -------------------------------------------------------------
|
||||||
|
# 20091002 - initial release - pdrolet (rdu@drolet.name)
|
||||||
|
|
||||||
|
# --------------------
|
||||||
|
# Parse options
|
||||||
|
# --------------------
|
||||||
|
# Known problem:
|
||||||
|
# - Command line cannot get a directory with a space in it
|
||||||
|
#
|
||||||
|
kdivider=1
|
||||||
|
find_options=""
|
||||||
|
while [ "$#" -ge 1 ]; do
|
||||||
|
case "$1" in
|
||||||
|
-m)
|
||||||
|
kdivider=1024
|
||||||
|
;;
|
||||||
|
-g)
|
||||||
|
kdivider=1048576
|
||||||
|
;;
|
||||||
|
-h|--help)
|
||||||
|
echo
|
||||||
|
echo $0: \<directories\> \[options below and any \"find\" options\]
|
||||||
|
echo \ \ -m: result in mega bytes \(rounded up\)
|
||||||
|
echo \ \ -g: result in giga bytes \(rounded up\)
|
||||||
|
echo \ \ -h: this help
|
||||||
|
echo
|
||||||
|
exit 0
|
||||||
|
;;
|
||||||
|
*)
|
||||||
|
find_options="${find_options} $1"
|
||||||
|
;;
|
||||||
|
esac
|
||||||
|
shift
|
||||||
|
done
|
||||||
|
|
||||||
|
# ------------------------------------------------------------------------------------------------------
|
||||||
|
# Compute the size
|
||||||
|
# ------------------------------------------------------------------------------------------------------
|
||||||
|
# 1) Find selected files and list link count, inodes, file type and size
|
||||||
|
# 2) Sort (sorts on inodes since link count is constant per inode)
|
||||||
|
# 3) Merge duplicates using uniq
|
||||||
|
# (result is occurrence count, link count, inode, file type and size)
|
||||||
|
# 4) Use awk to sum up the file size of each inode when the occurrence count
|
||||||
|
# and link count are the same. Use %k for size since awk's printf is 32 bits
|
||||||
|
# 5) Present the result with additional dividers based on command line parameters
|
||||||
|
#
|
||||||
|
echo $((( `find ${find_options} -printf '%n %i %y %k \n' \
|
||||||
|
| sort -n \
|
||||||
|
| uniq -c \
|
||||||
|
| awk '{ if (( $1 == $2 ) || ($4 == "d")) { sum += $5; } } END { printf "%u\n",(sum); }'` \
|
||||||
|
+ ${kdivider} -1 ) / ${kdivider} ))
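# Usage example (illustrative paths): "rdu -m /backup/daily.20091127-0100.4242"
# prints that snapshot's real size in MB; unlike plain du, files whose other
# hard links live outside the listed directories are not counted.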
|
||||||
|
|
||||||
1
contrib/exclude_lists/debian
Normal file
|
|
@ -0,0 +1 @@
|
||||||
|
/var/cache/apt/archives/*
|
||||||
22
contrib/jbrendel-autobackup/backup.sh
Normal file
|
|
@ -0,0 +1,22 @@
|
||||||
|
#!/bin/bash
|
||||||
|
|
||||||
|
function mkbackup {
|
||||||
|
find /etc/ccollect/logwrapper/destination -type f -atime +2 -exec sudo rm {} \;
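# descriptive note (assuming this directory holds the ccollect_logwrapper output):
# the find above drops log files that have not been accessed for more than two days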
|
||||||
|
/home/jcb/bm.pl &
|
||||||
|
}
|
||||||
|
|
||||||
|
mkdir -p /media/backupdisk
|
||||||
|
grep backupdisk /etc/mtab &> /dev/null
|
||||||
|
|
||||||
|
if [ $? == 0 ]
|
||||||
|
then
|
||||||
|
mkbackup
|
||||||
|
else
|
||||||
|
mount /media/backupdisk
|
||||||
|
if [ $? == 0 ]
|
||||||
|
then
|
||||||
|
mkbackup
|
||||||
|
else
|
||||||
|
echo "Error mounting backup disk"
|
||||||
|
fi
|
||||||
|
fi
|
||||||
242
contrib/jbrendel-autobackup/bm.pl
Normal file
|
|
@ -0,0 +1,242 @@
|
||||||
|
#!/usr/bin/perl
|
||||||
|
|
||||||
|
###############################
|
||||||
|
#
|
||||||
|
# Jens-Christoph Brendel, 2009
|
||||||
|
# licensed under GPL3 NO WARRANTY
|
||||||
|
#
|
||||||
|
###############################
|
||||||
|
|
||||||
|
use Date::Calc qw(:all);
|
||||||
|
use strict;
|
||||||
|
use warnings;
|
||||||
|
|
||||||
|
#
|
||||||
|
#!!!!!!!!!!!!!!!!! you need to customize these settings !!!!!!!!!!!!!!!!!!!!
|
||||||
|
#
|
||||||
|
my $backupdir = "/media/backupdisk";
|
||||||
|
my $logwrapper = "/home/jcb/ccollect/tools/ccollect_logwrapper.sh";
|
||||||
|
|
||||||
|
#!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!
|
||||||
|
|
||||||
|
# +------------------------------------------------------------------------+
|
||||||
|
# | |
|
||||||
|
# | V A R I A B L E S |
|
||||||
|
# | |
|
||||||
|
# +------------------------------------------------------------------------+
|
||||||
|
#
|
||||||
|
|
||||||
|
# get the current date
|
||||||
|
#
|
||||||
|
my ($sek, $min, $hour, $day, $month, $year) = localtime();
|
||||||
|
|
||||||
|
my $curr_year = $year + 1900;
|
||||||
|
my $curr_month = $month +1;
|
||||||
|
my ($curr_week,$cur_year) = Week_of_Year($curr_year,$curr_month,$day);
|
||||||
|
|
||||||
|
# initialize some variables
|
||||||
|
#
|
||||||
|
my %most_recent_daily = (
|
||||||
|
'age' => 9999,
|
||||||
|
'file' => ''
|
||||||
|
);
|
||||||
|
|
||||||
|
my %most_recent_weekly = (
|
||||||
|
'age' => 9999,
|
||||||
|
'file' => ''
|
||||||
|
);
|
||||||
|
|
||||||
|
my %most_recent_monthly = (
|
||||||
|
'age' => 9999,
|
||||||
|
'file' => ''
|
||||||
|
);
|
||||||
|
|
||||||
|
# prepare the output formatting
|
||||||
|
#
|
||||||
|
#---------------------------------------------------------------------------
|
||||||
|
my ($msg1, $msg2, $msg3, $msg4);
|
||||||
|
|
||||||
|
format =
|
||||||
|
@<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<
|
||||||
|
$msg1
|
||||||
|
@<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<< @<<<<<<<<<<<<<<<<<
|
||||||
|
$msg2, $msg3
|
||||||
|
|
||||||
|
@||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
|
||||||
|
$msg4
|
||||||
|
.
|
||||||
|
|
||||||
|
my @months = (' ','January', 'February', 'March', 'April',
|
||||||
|
'May', 'June', 'July', 'August',
|
||||||
|
'September', 'October', 'November',
|
||||||
|
'December');
|
||||||
|
|
||||||
|
# +------------------------------------------------------------------------+
|
||||||
|
# | |
|
||||||
|
# | P r o c e d u r e s |
|
||||||
|
# | |
|
||||||
|
# +------------------------------------------------------------------------+
|
||||||
|
#
|
||||||
|
|
||||||
|
# PURPOSE: extract the date from the file name
|
||||||
|
# PARAMETER VALUE: file name
|
||||||
|
# RETURN VALUE: pointer of a hash containing year, month, day
|
||||||
|
#
|
||||||
|
sub decodeDate {
|
||||||
|
my $file = shift;
|
||||||
|
$file =~ /^(daily|weekly|monthly)\.(\d+)-.*/;
|
||||||
|
my %date = (
|
||||||
|
'y' => substr($2,0,4),
|
||||||
|
'm' => substr($2,4,2),
|
||||||
|
'd' => substr($2,6,2)
|
||||||
|
);
|
||||||
|
return \%date;
|
||||||
|
}
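# Example (illustrative file name): "daily.20091127-0100.4242" matches the
# pattern above, so decodeDate returns { y => "2009", m => "11", d => "27" }.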
|
||||||
|
|
||||||
|
# PURPOSE: calculate the file age in days
|
||||||
|
# PARAMETER VALUE: name of a ccollect backup file
|
||||||
|
# RETURN VALUE: age in days
|
||||||
|
#
|
||||||
|
sub AgeInDays {
|
||||||
|
my $file = shift;
|
||||||
|
my $date=decodeDate($file);
|
||||||
|
my $ageindays = Delta_Days($$date{'y'}, $$date{'m'}, $$date{'d'}, $curr_year, $curr_month, $day);
|
||||||
|
return $ageindays;
|
||||||
|
}
|
||||||
|
|
||||||
|
# PURPOSE: calculate the file age in number of weeks
|
||||||
|
# PARAMETER VALUE: name of a ccollect backup file
|
||||||
|
# RETURN VALUE: age in weeks
|
||||||
|
#
|
||||||
|
sub AgeInWeeks {
|
||||||
|
my($y,$m,$d);
|
||||||
|
|
||||||
|
my $file = shift;
|
||||||
|
my $date = decodeDate($file);
|
||||||
|
my ($weeknr,$yr) = Week_of_Year($$date{'y'}, $$date{'m'}, $$date{'d'});
|
||||||
|
my $ageinweeks = $curr_week - $weeknr;
|
||||||
|
return $ageinweeks;
|
||||||
|
}
|
||||||
|
|
||||||
|
# PURPOSE: calculate the file age in number of months
|
||||||
|
# PARAMETER VALUE: name of a ccollect backup file
|
||||||
|
# RETURN VALUE: age in months
|
||||||
|
#
|
||||||
|
sub AgeInMonths {
|
||||||
|
my $ageinmonths;
|
||||||
|
my $ageinmonths;
|
||||||
|
my $file = shift;
|
||||||
|
my $date = decodeDate($file);
|
||||||
|
if ($curr_year == $$date{'y'}) {
|
||||||
|
$ageinmonths = $curr_month - $$date{'m'};
|
||||||
|
} else {
|
||||||
|
$ageinmonths = $curr_month + (12-$$date{'m'}) + ($curr_year-$$date{'y'}-1)*12;
|
||||||
|
}
|
||||||
|
return $ageinmonths;
|
||||||
|
}
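# Example (dates illustrative): a monthly.20081115-... backup seen in January
# 2010 takes the else branch: 1 + (12-11) + (2010-2008-1)*12 = 14 months.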
|
||||||
|
|
||||||
|
# +------------------------------------------------------------------------+
|
||||||
|
# | |
|
||||||
|
# | M A I N |
|
||||||
|
# | |
|
||||||
|
# +------------------------------------------------------------------------+
|
||||||
|
#
|
||||||
|
|
||||||
|
#
|
||||||
|
# find the most recent daily, weekly and monthly backup file
|
||||||
|
#
|
||||||
|
|
||||||
|
opendir(DIRH, $backupdir) or die "Can't open $backupdir \n";
|
||||||
|
|
||||||
|
my @files = readdir(DIRH);
|
||||||
|
|
||||||
|
die "Zielverzeichnis leer \n" if ( $#files <= 1 );
|
||||||
|
|
||||||
|
foreach my $file (@files) {
|
||||||
|
|
||||||
|
next if $file eq "." or $file eq "..";
|
||||||
|
|
||||||
|
SWITCH: {
|
||||||
|
if ($file =~ /^daily/) {
|
||||||
|
my $curr_age=AgeInDays($file);
|
||||||
|
if ($curr_age<$most_recent_daily{'age'}) {
|
||||||
|
$most_recent_daily{'age'} =$curr_age;
|
||||||
|
$most_recent_daily{'file'}= $file;
|
||||||
|
}
|
||||||
|
last SWITCH;
|
||||||
|
}
|
||||||
|
|
||||||
|
if ($file =~ /^weekly/) {
|
||||||
|
my $curr_week_age = AgeInWeeks($file);
|
||||||
|
if ($curr_week_age<$most_recent_weekly{'age'}) {
|
||||||
|
$most_recent_weekly{'age'} =$curr_week_age;
|
||||||
|
$most_recent_weekly{'file'}=$file;
|
||||||
|
}
|
||||||
|
last SWITCH;
|
||||||
|
}
|
||||||
|
|
||||||
|
if ($file =~ /^monthly/) {
|
||||||
|
my $curr_month_age=AgeInMonths($file);
|
||||||
|
if ($curr_month_age < $most_recent_monthly{'age'}) {
|
||||||
|
$most_recent_monthly{'age'} =$curr_month_age;
|
||||||
|
$most_recent_monthly{'file'}=$file;
|
||||||
|
}
|
||||||
|
last SWITCH;
|
||||||
|
}
|
||||||
|
print "\n\n unknown file $file \n\n";
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
printf("\nBackup Manager started: %02u.%02u. %u, week %02u\n\n", $day, $curr_month, $curr_year, $curr_week);
|
||||||
|
|
||||||
|
#
|
||||||
|
# compare the most recent daily, weekly and monthly backup file
|
||||||
|
# and decide if it's necessary to start a new backup process in
|
||||||
|
# each category
|
||||||
|
#
|
||||||
|
|
||||||
|
if ($most_recent_monthly{'age'} == 0) {
|
||||||
|
$msg1="The most recent monthly backup";
|
||||||
|
$msg2="$most_recent_monthly{'file'} from $months[$curr_month - $most_recent_monthly{'age'}]";
|
||||||
|
$msg3="is still valid.";
|
||||||
|
$msg4="";
|
||||||
|
write;
|
||||||
|
} else {
|
||||||
|
$msg1="The most recent monthly backup";
|
||||||
|
$msg2="$most_recent_monthly{'file'} from $months[$curr_month - $most_recent_monthly{'age'}]";
|
||||||
|
$msg3="is out-dated.";
|
||||||
|
$msg4="Starting new monthly backup.";
|
||||||
|
write;
|
||||||
|
exec "sudo $logwrapper monthly FULL";
|
||||||
|
exit;
|
||||||
|
}
|
||||||
|
|
||||||
|
if ($most_recent_weekly{'age'} == 0) {
|
||||||
|
$msg1="The most recent weekly backup";
|
||||||
|
$msg2="$most_recent_weekly{'file'} from week nr: $curr_week-$most_recent_weekly{'age'}";
|
||||||
|
$msg3="is still valid.";
|
||||||
|
$msg4="";
|
||||||
|
write;
|
||||||
|
} else {
|
||||||
|
$msg1="The most recent weekly backup";
|
||||||
|
$msg2="$most_recent_weekly{'file'} from week nr: $curr_week-$most_recent_weekly{'age'}";
|
||||||
|
$msg3="is out-dated.";
|
||||||
|
$msg4="Starting new weekly backup.";
|
||||||
|
write;
|
||||||
|
exec "sudo $logwrapper weekly FULL";
|
||||||
|
exit;
|
||||||
|
}
|
||||||
|
|
||||||
|
if ($most_recent_daily{'age'} == 0 ) {
|
||||||
|
$msg1=" The most recent daily backup";
|
||||||
|
$msg2="$most_recent_daily{'file'}";
|
||||||
|
$msg3="is still valid.";
|
||||||
|
$msg4="";
|
||||||
|
write;
|
||||||
|
} else {
|
||||||
|
$msg1="The most recent daily backup";
|
||||||
|
$msg2="$most_recent_daily{'file'}";
|
||||||
|
$msg3="is out-dated.";
|
||||||
|
$msg4="Starting new daily backup.";
|
||||||
|
write;
|
||||||
|
exec "sudo $logwrapper daily FULL";
|
||||||
3
contrib/jbrendel-autobackup/correction_1
Normal file
|
|
@ -0,0 +1,3 @@
|
||||||
|
- Line 126/127 (my $ageinmonths;) is duplicated; delete one of the two.
- The very last line needs a closing curly brace "}" that got lost somewhere.
|
||||||
15
contrib/jlawless-2009-06-03/README_g-i.txt
Normal file
|
|
@ -0,0 +1,15 @@
|
||||||
|
Hello Nico,
|
||||||
|
|
||||||
|
I have attached three more patches for ccollect. Each patch
|
||||||
|
has comments explaining its motivation.
|
||||||
|
|
||||||
|
All of these patches work-for-me (but I continue to test
|
||||||
|
them). I would be interested in your opinion on, for example, the
|
||||||
|
general approach used in i.patch which changes the way options are
|
||||||
|
handled. I think it is a big improvement. If, however, you wanted
|
||||||
|
the code to go in a different direction, let me know before we
|
||||||
|
diverge too far.
|
||||||
|
|
||||||
|
Regards,
|
||||||
|
|
||||||
|
John
|
||||||
683
contrib/jlawless-2009-06-03/ccollect-f.sh
Executable file
|
|
@ -0,0 +1,683 @@
|
||||||
|
#!/bin/sh
|
||||||
|
#
|
||||||
|
# 2005-2009 Nico Schottelius (nico-ccollect at schottelius.org)
|
||||||
|
#
|
||||||
|
# This file is part of ccollect.
|
||||||
|
#
|
||||||
|
# ccollect is free software: you can redistribute it and/or modify
|
||||||
|
# it under the terms of the GNU General Public License as published by
|
||||||
|
# the Free Software Foundation, either version 3 of the License, or
|
||||||
|
# (at your option) any later version.
|
||||||
|
#
|
||||||
|
# ccollect is distributed in the hope that it will be useful,
|
||||||
|
# but WITHOUT ANY WARRANTY; without even the implied warranty of
|
||||||
|
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
|
||||||
|
# GNU General Public License for more details.
|
||||||
|
#
|
||||||
|
# You should have received a copy of the GNU General Public License
|
||||||
|
# along with ccollect. If not, see <http://www.gnu.org/licenses/>.
|
||||||
|
#
|
||||||
|
# Initially written for SyGroup (www.sygroup.ch)
|
||||||
|
# Date: Mon Nov 14 11:45:11 CET 2005
|
||||||
|
|
||||||
|
#
|
||||||
|
# Standard variables (stolen from cconf)
|
||||||
|
#
|
||||||
|
__pwd="$(pwd -P)"
|
||||||
|
__mydir="${0%/*}"; __abs_mydir="$(cd "$__mydir" && pwd -P)"
|
||||||
|
__myname=${0##*/}; __abs_myname="$__abs_mydir/$__myname"
|
||||||
|
|
||||||
|
#
|
||||||
|
# where to find our configuration and temporary file
|
||||||
|
#
|
||||||
|
CCOLLECT_CONF=${CCOLLECT_CONF:-/etc/ccollect}
|
||||||
|
CSOURCES=${CCOLLECT_CONF}/sources
|
||||||
|
CDEFAULTS=${CCOLLECT_CONF}/defaults
|
||||||
|
CPREEXEC="${CDEFAULTS}/pre_exec"
|
||||||
|
CPOSTEXEC="${CDEFAULTS}/post_exec"
|
||||||
|
|
||||||
|
TMP=$(mktemp "/tmp/${__myname}.XXXXXX")
|
||||||
|
VERSION=0.7.1
|
||||||
|
RELEASE="2009-02-02"
|
||||||
|
HALF_VERSION="ccollect ${VERSION}"
|
||||||
|
FULL_VERSION="ccollect ${VERSION} (${RELEASE})"
|
||||||
|
|
||||||
|
#TSORT="tc" ; NEWER="cnewer"
|
||||||
|
TSORT="t" ; NEWER="newer"
|
||||||
|
|
||||||
|
#
|
||||||
|
# CDATE: how we use it for naming of the archives
|
||||||
|
# DDATE: how the user should see it in our output (DISPLAY)
|
||||||
|
#
|
||||||
|
CDATE="date +%Y%m%d-%H%M"
|
||||||
|
DDATE="date +%Y-%m-%d-%H:%M:%S"
|
||||||
|
|
||||||
|
#
|
||||||
|
# unset parallel execution
|
||||||
|
#
|
||||||
|
PARALLEL=""
|
||||||
|
|
||||||
|
#
|
||||||
|
# catch signals
|
||||||
|
#
|
||||||
|
trap "rm -f \"${TMP}\"" 1 2 15
|
||||||
|
|
||||||
|
#
|
||||||
|
# Functions
|
||||||
|
#
|
||||||
|
|
||||||
|
# time displaying echo
|
||||||
|
_techo()
|
||||||
|
{
|
||||||
|
echo "$(${DDATE}): $@"
|
||||||
|
}
|
||||||
|
|
||||||
|
# exit on error
|
||||||
|
_exit_err()
|
||||||
|
{
|
||||||
|
_techo "$@"
|
||||||
|
rm -f "${TMP}"
|
||||||
|
exit 1
|
||||||
|
}
|
||||||
|
|
||||||
|
add_name()
|
||||||
|
{
|
||||||
|
awk "{ print \"[${name}] \" \$0 }"
|
||||||
|
}
|
||||||
|
|
||||||
|
pcmd()
|
||||||
|
{
|
||||||
|
if [ "$remote_host" ]; then
|
||||||
|
ssh "$remote_host" "$@"
|
||||||
|
else
|
||||||
|
"$@"
|
||||||
|
fi
|
||||||
|
}
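# Illustrative use: with remote_host set (say, to "backuphost"), `pcmd ls -1 "${ddir}"`
# runs `ssh backuphost ls -1 ...`; with remote_host empty, the command runs locally.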
|
||||||
|
|
||||||
|
#
|
||||||
|
# Version
|
||||||
|
#
|
||||||
|
display_version()
|
||||||
|
{
|
||||||
|
echo "${FULL_VERSION}"
|
||||||
|
exit 0
|
||||||
|
}
|
||||||
|
|
||||||
|
#
|
||||||
|
# Tell how to use us
|
||||||
|
#
|
||||||
|
usage()
|
||||||
|
{
|
||||||
|
echo "${__myname}: <interval name> [args] <sources to backup>"
|
||||||
|
echo ""
|
||||||
|
echo " ccollect creates (pseudo) incremental backups"
|
||||||
|
echo ""
|
||||||
|
echo " -h, --help: Show this help screen"
|
||||||
|
echo " -p, --parallel: Parallelise backup processes"
|
||||||
|
echo " -a, --all: Backup all sources specified in ${CSOURCES}"
|
||||||
|
echo " -v, --verbose: Be very verbose (uses set -x)"
|
||||||
|
echo " -V, --version: Print version information"
|
||||||
|
echo ""
|
||||||
|
echo " This is version ${VERSION}, released on ${RELEASE}"
|
||||||
|
echo " (the first version was written on 2005-12-05 by Nico Schottelius)."
|
||||||
|
echo ""
|
||||||
|
echo " Retrieve latest ccollect at http://unix.schottelius.org/ccollect/"
|
||||||
|
exit 0
|
||||||
|
}
|
||||||
|
|
||||||
|
#
|
||||||
|
# Select interval if AUTO
|
||||||
|
#
|
||||||
|
# For this to work nicely, you have to choose interval names that sort nicely
|
||||||
|
# such as int1, int2, int3 or a_daily, b_weekly, c_monthly, etc.
|
||||||
|
#
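# Illustrative layout (names and limits assumed, following the convention above):
#   ${CDEFAULTS}/intervals/a_daily   containing 7
#   ${CDEFAULTS}/intervals/b_weekly  containing 4
#   ${CDEFAULTS}/intervals/c_monthly containing 12
# auto_interval below starts at the most frequent interval and picks the first
# one that does not yet have its full quota of backups newer than the latest
# backup of every less frequent interval.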
|
||||||
|
auto_interval()
|
||||||
|
{
|
||||||
|
if [ -d "${backup}/intervals" -a -n "$(ls "${backup}/intervals" 2>/dev/null)" ] ; then
|
||||||
|
intervals_dir="${backup}/intervals"
|
||||||
|
elif [ -d "${CDEFAULTS}/intervals" -a -n "$(ls "${CDEFAULTS}/intervals" 2>/dev/null)" ] ; then
|
||||||
|
intervals_dir="${CDEFAULTS}/intervals"
|
||||||
|
else
|
||||||
|
_exit_err "No intervals are defined. Skipping."
|
||||||
|
fi
|
||||||
|
echo intervals_dir=${intervals_dir}
|
||||||
|
|
||||||
|
trial_interval="$(ls -1r "${intervals_dir}/" | head -n 1)" || \
|
||||||
|
_exit_err "Failed to list contents of ${intervals_dir}/."
|
||||||
|
_techo "Considering interval ${trial_interval}"
|
||||||
|
most_recent="$(pcmd ls -${TSORT}p1 "${ddir}" | grep "^${trial_interval}.*/$" | head -n 1)" || \
|
||||||
|
_exit_err "Failed to list contents of ${ddir}/."
|
||||||
|
_techo " Most recent ${trial_interval}: '${most_recent}'"
|
||||||
|
if [ -n "${most_recent}" ]; then
|
||||||
|
no_intervals="$(ls -1 "${intervals_dir}/" | wc -l)"
|
||||||
|
n=1
|
||||||
|
while [ "${n}" -le "${no_intervals}" ]; do
|
||||||
|
trial_interval="$(ls -p1 "${intervals_dir}/" | tail -n+${n} | head -n 1)"
|
||||||
|
_techo "Considering interval '${trial_interval}'"
|
||||||
|
c_interval="$(cat "${intervals_dir}/${trial_interval}" 2>/dev/null)"
|
||||||
|
m=$((${n}+1))
|
||||||
|
set -- "${ddir}" -maxdepth 1
|
||||||
|
while [ "${m}" -le "${no_intervals}" ]; do
|
||||||
|
interval_m="$(ls -1 "${intervals_dir}/" | tail -n+${m} | head -n 1)"
|
||||||
|
most_recent="$(pcmd ls -${TSORT}p1 "${ddir}" | grep "^${interval_m}\..*/$" | head -n 1)"
|
||||||
|
_techo " Most recent ${interval_m}: '${most_recent}'"
|
||||||
|
if [ -n "${most_recent}" ] ; then
|
||||||
|
set -- "$@" -$NEWER "${ddir}/${most_recent}"
|
||||||
|
fi
|
||||||
|
m=$((${m}+1))
|
||||||
|
done
|
||||||
|
count=$(pcmd find "$@" -iname "${trial_interval}*" | wc -l)
|
||||||
|
_techo " Found $count more recent backups of ${trial_interval} (limit: ${c_interval})"
|
||||||
|
if [ "$count" -lt "${c_interval}" ] ; then
|
||||||
|
break
|
||||||
|
fi
|
||||||
|
n=$((${n}+1))
|
||||||
|
done
|
||||||
|
fi
|
||||||
|
export INTERVAL="${trial_interval}"
|
||||||
|
D_FILE_INTERVAL="${intervals_dir}/${INTERVAL}"
|
||||||
|
D_INTERVAL=$(cat "${D_FILE_INTERVAL}" 2>/dev/null)
|
||||||
|
}
|
||||||
|
|
||||||
|
#
|
||||||
|
# need at least interval and one source or --all
|
||||||
|
#
|
||||||
|
if [ $# -lt 2 ]; then
|
||||||
|
if [ "$1" = "-V" -o "$1" = "--version" ]; then
|
||||||
|
display_version
|
||||||
|
else
|
||||||
|
usage
|
||||||
|
fi
|
||||||
|
fi
|
||||||
|
|
||||||
|
#
|
||||||
|
# check for configuration directory
|
||||||
|
#
|
||||||
|
[ -d "${CCOLLECT_CONF}" ] || _exit_err "No configuration found in " \
|
||||||
|
"\"${CCOLLECT_CONF}\" (is \$CCOLLECT_CONF properly set?)"
|
||||||
|
|
||||||
|
#
|
||||||
|
# Filter arguments
|
||||||
|
#
|
||||||
|
export INTERVAL="$1"; shift
|
||||||
|
i=1
|
||||||
|
no_sources=0
|
||||||
|
|
||||||
|
#
|
||||||
|
# Create source "array"
|
||||||
|
#
|
||||||
|
while [ "$#" -ge 1 ]; do
|
||||||
|
eval arg=\"\$1\"; shift
|
||||||
|
|
||||||
|
if [ "${NO_MORE_ARGS}" = 1 ]; then
|
||||||
|
eval source_${no_sources}=\"${arg}\"
|
||||||
|
no_sources=$((${no_sources}+1))
|
||||||
|
|
||||||
|
# make variable available for subscripts
|
||||||
|
eval export source_${no_sources}
|
||||||
|
else
|
||||||
|
case "${arg}" in
|
||||||
|
-a|--all)
|
||||||
|
ALL=1
|
||||||
|
;;
|
||||||
|
-v|--verbose)
|
||||||
|
VERBOSE=1
|
||||||
|
;;
|
||||||
|
-p|--parallel)
|
||||||
|
PARALLEL=1
|
||||||
|
;;
|
||||||
|
-h|--help)
|
||||||
|
usage
|
||||||
|
;;
|
||||||
|
--)
|
||||||
|
NO_MORE_ARGS=1
|
||||||
|
;;
|
||||||
|
*)
|
||||||
|
eval source_${no_sources}=\"$arg\"
|
||||||
|
no_sources=$(($no_sources+1))
|
||||||
|
;;
|
||||||
|
esac
|
||||||
|
fi
|
||||||
|
|
||||||
|
i=$(($i+1))
|
||||||
|
done
|
||||||
|
|
||||||
|
# also export number of sources
|
||||||
|
export no_sources
|
||||||
|
|
||||||
|
#
|
||||||
|
# be really, really, really verbose
|
||||||
|
#
|
||||||
|
if [ "${VERBOSE}" = 1 ]; then
|
||||||
|
set -x
|
||||||
|
fi
|
||||||
|
|
||||||
|
#
|
||||||
|
# Look, if we should take ALL sources
|
||||||
|
#
|
||||||
|
if [ "${ALL}" = 1 ]; then
|
||||||
|
# reset everything specified before
|
||||||
|
no_sources=0
|
||||||
|
|
||||||
|
#
|
||||||
|
# get entries from sources
|
||||||
|
#
|
||||||
|
cwd=$(pwd -P)
|
||||||
|
( cd "${CSOURCES}" && ls > "${TMP}" ); ret=$?
|
||||||
|
|
||||||
|
[ "${ret}" -eq 0 ] || _exit_err "Listing of sources failed. Aborting."
|
||||||
|
|
||||||
|
while read tmp; do
|
||||||
|
eval source_${no_sources}=\"${tmp}\"
|
||||||
|
no_sources=$((${no_sources}+1))
|
||||||
|
done < "${TMP}"
|
||||||
|
fi
|
||||||
|
|
||||||
|
#
|
||||||
|
# Need at least ONE source to backup
|
||||||
|
#
|
||||||
|
if [ "${no_sources}" -lt 1 ]; then
|
||||||
|
usage
|
||||||
|
else
|
||||||
|
_techo "${HALF_VERSION}: Beginning backup using interval ${INTERVAL}"
|
||||||
|
fi
|
||||||
|
|
||||||
|
#
|
||||||
|
# Look for pre-exec command (general)
|
||||||
|
#
|
||||||
|
if [ -x "${CPREEXEC}" ]; then
|
||||||
|
_techo "Executing ${CPREEXEC} ..."
|
||||||
|
"${CPREEXEC}"; ret=$?
|
||||||
|
_techo "Finished ${CPREEXEC} (return code: ${ret})."
|
||||||
|
|
||||||
|
[ "${ret}" -eq 0 ] || _exit_err "${CPREEXEC} failed. Aborting"
|
||||||
|
fi
|
||||||
|
|
||||||
|
#
|
||||||
|
# check default configuration
|
||||||
|
#
|
||||||
|
|
||||||
|
D_FILE_INTERVAL="${CDEFAULTS}/intervals/${INTERVAL}"
|
||||||
|
D_INTERVAL=$(cat "${D_FILE_INTERVAL}" 2>/dev/null)
|
||||||
|
|
||||||
|
|
||||||
|
#
|
||||||
|
# Let's do the backup
|
||||||
|
#
|
||||||
|
i=0
|
||||||
|
while [ "${i}" -lt "${no_sources}" ]; do
|
||||||
|
|
||||||
|
#
|
||||||
|
# Get current source
|
||||||
|
#
|
||||||
|
eval name=\"\$source_${i}\"
|
||||||
|
i=$((${i}+1))
|
||||||
|
|
||||||
|
export name
|
||||||
|
|
||||||
|
#
|
||||||
|
# start ourself, if we want parallel execution
|
||||||
|
#
|
||||||
|
if [ "${PARALLEL}" ]; then
|
||||||
|
"$0" "${INTERVAL}" "${name}" &
|
||||||
|
continue
|
||||||
|
fi
|
||||||
|
|
||||||
|
#
|
||||||
|
# Start subshell for easy log editing
|
||||||
|
#
|
||||||
|
(
|
||||||
|
#
|
||||||
|
# Stderr to stdout, so we can produce nice logs
|
||||||
|
#
|
||||||
|
exec 2>&1
|
||||||
|
|
||||||
|
#
|
||||||
|
# Configuration
|
||||||
|
#
|
||||||
|
backup="${CSOURCES}/${name}"
|
||||||
|
c_source="${backup}/source"
|
||||||
|
c_dest="${backup}/destination"
|
||||||
|
c_exclude="${backup}/exclude"
|
||||||
|
c_verbose="${backup}/verbose"
|
||||||
|
c_vverbose="${backup}/very_verbose"
|
||||||
|
c_rsync_extra="${backup}/rsync_options"
|
||||||
|
c_summary="${backup}/summary"
|
||||||
|
c_pre_exec="${backup}/pre_exec"
|
||||||
|
c_post_exec="${backup}/post_exec"
|
||||||
|
f_incomplete="delete_incomplete"
|
||||||
|
c_incomplete="${backup}/${f_incomplete}"
|
||||||
|
c_remote_host="${backup}/remote_host"
|
||||||
|
|
||||||
|
#
|
||||||
|
# Marking backups: If we abort it's not removed => Backup is broken
|
||||||
|
#
|
||||||
|
c_marker=".ccollect-marker"
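# illustrative: if the run creating daily.20090820-0100.1234 is aborted, its
# marker file stays behind and the incomplete-backup check below will detect
# (and, with delete_incomplete set, remove) that half-written backup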
|
||||||
|
|
||||||
|
#
|
||||||
|
# Times
|
||||||
|
#
|
||||||
|
begin_s=$(date +%s)
|
||||||
|
|
||||||
|
#
|
||||||
|
# unset possible options
|
||||||
|
#
|
||||||
|
EXCLUDE=""
|
||||||
|
RSYNC_EXTRA=""
|
||||||
|
SUMMARY=""
|
||||||
|
VERBOSE=""
|
||||||
|
VVERBOSE=""
|
||||||
|
DELETE_INCOMPLETE=""
|
||||||
|
|
||||||
|
_techo "Beginning to backup"
|
||||||
|
|
||||||
|
#
|
||||||
|
# Standard configuration checks
|
||||||
|
#
|
||||||
|
if [ ! -e "${backup}" ]; then
|
||||||
|
_exit_err "Source does not exist."
|
||||||
|
fi
|
||||||
|
|
||||||
|
#
|
||||||
|
# configuration _must_ be a directory
|
||||||
|
#
|
||||||
|
if [ ! -d "${backup}" ]; then
|
||||||
|
_exit_err "\"${name}\" is not a cconfig-directory. Skipping."
|
||||||
|
fi
|
||||||
|
|
||||||
|
#
|
||||||
|
# first execute pre_exec, which may generate destination or other
|
||||||
|
# parameters
|
||||||
|
#
|
||||||
|
if [ -x "${c_pre_exec}" ]; then
|
||||||
|
_techo "Executing ${c_pre_exec} ..."
|
||||||
|
"${c_pre_exec}"; ret="$?"
|
||||||
|
_techo "Finished ${c_pre_exec} (return code ${ret})."
|
||||||
|
|
||||||
|
if [ "${ret}" -ne 0 ]; then
|
||||||
|
_exit_err "${c_pre_exec} failed. Skipping."
|
||||||
|
fi
|
||||||
|
fi
|
||||||
|
|
||||||
|
#
|
||||||
|
# Destination is a path
|
||||||
|
#
|
||||||
|
if [ ! -f "${c_dest}" ]; then
|
||||||
|
_exit_err "Destination ${c_dest} is not a file. Skipping."
|
||||||
|
else
|
||||||
|
ddir=$(cat "${c_dest}"); ret="$?"
|
||||||
|
if [ "${ret}" -ne 0 ]; then
|
||||||
|
_exit_err "Destination ${c_dest} is not readable. Skipping."
|
||||||
|
fi
|
||||||
|
fi
|
||||||
|
|
||||||
|
#
|
||||||
|
# interval definition: First try source specific, fallback to default
|
||||||
|
#
|
||||||
|
if [ ${INTERVAL} = "AUTO" ] ; then
|
||||||
|
auto_interval
|
||||||
|
_techo "Selected interval: '$INTERVAL'"
|
||||||
|
fi
|
||||||
|
c_interval="$(cat "${backup}/intervals/${INTERVAL}" 2>/dev/null)"
|
||||||
|
|
||||||
|
if [ -z "${c_interval}" ]; then
|
||||||
|
c_interval="${D_INTERVAL}"
|
||||||
|
|
||||||
|
if [ -z "${c_interval}" ]; then
|
||||||
|
_exit_err "No definition for interval \"${INTERVAL}\" found. Skipping."
|
||||||
|
fi
|
||||||
|
fi
|
||||||
|
|
||||||
|
#
|
||||||
|
# Source checks
|
||||||
|
#
|
||||||
|
if [ ! -f "${c_source}" ]; then
|
||||||
|
_exit_err "Source description \"${c_source}\" is not a file. Skipping."
|
||||||
|
else
|
||||||
|
source=$(cat "${c_source}"); ret="$?"
|
||||||
|
if [ "${ret}" -ne 0 ]; then
|
||||||
|
_exit_err "Source ${c_source} is not readable. Skipping."
|
||||||
|
fi
|
||||||
|
fi
|
||||||
|
# Verify source is up and accepting connections before deleting any old backups
|
||||||
|
rsync "$source" >/dev/null || _exit_err "Source ${source} is not readable. Skipping."
|
||||||
|
|
||||||
|
#
|
||||||
|
# do we backup to a remote host? then set pre-cmd
|
||||||
|
#
|
||||||
|
if [ -f "${c_remote_host}" ]; then
|
||||||
|
# adjust ls and co
|
||||||
|
remote_host=$(cat "${c_remote_host}"); ret="$?"
|
||||||
|
if [ "${ret}" -ne 0 ]; then
|
||||||
|
_exit_err "Remote host file ${c_remote_host} exists, but is not readable. Skipping."
|
||||||
|
fi
|
||||||
|
destination="${remote_host}:${ddir}"
|
||||||
|
else
|
||||||
|
remote_host=""
|
||||||
|
destination="${ddir}"
|
||||||
|
fi
|
||||||
|
export remote_host
|
||||||
|
|
||||||
|
#
|
||||||
|
# check for existence / use real name
|
||||||
|
#
|
||||||
|
( pcmd cd "$ddir" ) || _exit_err "Cannot change to ${ddir}. Skipping."
|
||||||
|
|
||||||
|
|
||||||
|
#
|
||||||
|
# Check whether to delete incomplete backups
|
||||||
|
#
|
||||||
|
if [ -f "${c_incomplete}" -o -f "${CDEFAULTS}/${f_incomplete}" ]; then
|
||||||
|
DELETE_INCOMPLETE="yes"
|
||||||
|
fi
|
||||||
|
|
||||||
|
# NEW method as of 0.6:
|
||||||
|
# - insert ccollect default parameters
|
||||||
|
# - insert options
|
||||||
|
# - insert user options
|
||||||
|
|
||||||
|
#
|
||||||
|
# rsync standard options
|
||||||
|
#
|
||||||
|
|
||||||
|
set -- "$@" "--archive" "--delete" "--numeric-ids" "--relative" \
|
||||||
|
"--delete-excluded" "--sparse"
|
||||||
|
|
||||||
|
#
|
||||||
|
# exclude list
|
||||||
|
#
|
||||||
|
if [ -f "${c_exclude}" ]; then
|
||||||
|
set -- "$@" "--exclude-from=${c_exclude}"
|
||||||
|
fi
|
||||||
|
|
||||||
|
#
|
||||||
|
# Output a summary
|
||||||
|
#
|
||||||
|
if [ -f "${c_summary}" ]; then
|
||||||
|
set -- "$@" "--stats"
|
||||||
|
fi
|
||||||
|
|
||||||
|
#
|
||||||
|
# Verbosity for rsync
|
||||||
|
#
|
||||||
|
if [ -f "${c_vverbose}" ]; then
|
||||||
|
set -- "$@" "-vv"
|
||||||
|
elif [ -f "${c_verbose}" ]; then
|
||||||
|
set -- "$@" "-v"
|
||||||
|
fi
|
||||||
|
|
||||||
|
#
|
||||||
|
# extra options for rsync provided by the user
|
||||||
|
#
|
||||||
|
if [ -f "${c_rsync_extra}" ]; then
|
||||||
|
while read line; do
|
||||||
|
set -- "$@" "$line"
|
||||||
|
done < "${c_rsync_extra}"
|
||||||
|
fi
|
||||||
|
|
||||||
|
#
|
||||||
|
# Check for incomplete backups
|
||||||
|
#
|
||||||
|
pcmd ls -1 "$ddir/${INTERVAL}"*".${c_marker}" > "${TMP}" 2>/dev/null
|
||||||
|
|
||||||
|
i=0
|
||||||
|
while read incomplete; do
|
||||||
|
eval incomplete_$i=\"$(echo ${incomplete} | sed "s/\\.${c_marker}\$//")\"
|
||||||
|
i=$(($i+1))
|
||||||
|
done < "${TMP}"
|
||||||
|
|
||||||
|
j=0
|
||||||
|
while [ "$j" -lt "$i" ]; do
|
||||||
|
eval realincomplete=\"\$incomplete_$j\"
|
||||||
|
_techo "Incomplete backup: ${realincomplete}"
|
||||||
|
if [ "${DELETE_INCOMPLETE}" = "yes" ]; then
|
||||||
|
_techo "Deleting ${realincomplete} ..."
|
||||||
|
pcmd rm $VVERBOSE -rf "${ddir}/${realincomplete}" || \
|
||||||
|
_exit_err "Removing ${realincomplete} failed."
|
||||||
|
fi
|
||||||
|
j=$(($j+1))
|
||||||
|
done
|
||||||
|
|
||||||
|
#
|
||||||
|
# check if maximum number of backups is reached, if so remove
|
||||||
|
# use grep and ls -p so we only look at directories
|
||||||
|
#
|
||||||
|
count="$(pcmd ls -p1 "${ddir}" | grep "^${INTERVAL}\..*/\$" | wc -l \
|
||||||
|
| sed 's/^ *//g')" || _exit_err "Counting backups failed"
|
||||||
|
|
||||||
|
_techo "Existing backups: ${count} Total keeping backups: ${c_interval}"
|
||||||
|
|
||||||
|
if [ "${count}" -ge "${c_interval}" ]; then
|
||||||
|
substract=$((${c_interval} - 1))
|
||||||
|
remove=$((${count} - ${substract}))
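# e.g. 10 existing backups with a limit of 7 gives substract=6 and remove=4,
# leaving 6 old backups before the new one is created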
|
||||||
|
_techo "Removing ${remove} backup(s)..."
|
||||||
|
|
||||||
|
pcmd ls -${TSORT}p1r "$ddir" | grep "^${INTERVAL}\..*/\$" | \
|
||||||
|
head -n "${remove}" > "${TMP}" || \
|
||||||
|
_exit_err "Listing old backups failed"
|
||||||
|
|
||||||
|
i=0
|
||||||
|
while read to_remove; do
|
||||||
|
eval remove_$i=\"${to_remove}\"
|
||||||
|
i=$(($i+1))
|
||||||
|
done < "${TMP}"
|
||||||
|
|
||||||
|
j=0
|
||||||
|
while [ "$j" -lt "$i" ]; do
|
||||||
|
eval to_remove=\"\$remove_$j\"
|
||||||
|
_techo "Removing ${to_remove} ..."
|
||||||
|
pcmd rm ${VVERBOSE} -rf "${ddir}/${to_remove}" || \
|
||||||
|
_exit_err "Removing ${to_remove} failed."
|
||||||
|
j=$(($j+1))
|
||||||
|
done
|
||||||
|
fi
|
||||||
|
|
||||||
|
|
||||||
|
#
|
||||||
|
# Check for backup directory to clone from: Always clone from the latest one!
|
||||||
|
#
|
||||||
|
# Depending on your file system, you may want to sort on:
|
||||||
|
# 1. mtime (modification time) with TSORT=t, or
|
||||||
|
# 2. ctime (last change time, usually) with TSORT=tc
|
||||||
|
last_dir="$(pcmd ls -${TSORT}p1 "${ddir}" | grep '/$' | head -n 1)" || \
|
||||||
|
_exit_err "Failed to list contents of ${ddir}."
|
||||||
|
|
||||||
|
#
|
||||||
|
# clone from old backup, if existing
|
||||||
|
#
|
||||||
|
if [ "${last_dir}" ]; then
|
||||||
|
set -- "$@" "--link-dest=${ddir}/${last_dir}"
|
||||||
|
_techo "Hard linking from ${last_dir}"
|
||||||
|
fi
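# illustrative: if the newest entry is daily.20090819-0100.4321/, rsync gets
# --link-dest pointing at it, so files unchanged since that run are
# hard-linked into the new backup instead of being copied again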
|
||||||
|
|
||||||
|
|
||||||
|
# set time when we really begin to backup, not when we began to remove above
|
||||||
|
destination_date=$(${CDATE})
|
||||||
|
destination_dir="${ddir}/${INTERVAL}.${destination_date}.$$"
|
||||||
|
destination_full="${destination}/${INTERVAL}.${destination_date}.$$"
|
||||||
|
|
||||||
|
# give some info
|
||||||
|
_techo "Beginning to backup, this may take some time..."
|
||||||
|
|
||||||
|
_techo "Creating ${destination_dir} ..."
|
||||||
|
pcmd mkdir ${VVERBOSE} "${destination_dir}" || \
|
||||||
|
_exit_err "Creating ${destination_dir} failed. Skipping."
|
||||||
|
|
||||||
|
#
|
||||||
|
# added marking in 0.6 (and remove it, if successful later)
|
||||||
|
#
|
||||||
|
pcmd touch "${destination_dir}.${c_marker}"
|
||||||
|
|
||||||
|
#
|
||||||
|
# the rsync part
|
||||||
|
#
|
||||||
|
_techo "Transferring files..."
|
||||||
|
rsync "$@" "${source}" "${destination_full}"; ret=$?
|
||||||
|
# Correct the modification time:
|
||||||
|
pcmd touch "${destination_dir}"
|
||||||
|
|
||||||
|
#
|
||||||
|
# remove marking here
|
||||||
|
#
|
||||||
|
if [ "$ret" -ne 12 ] ; then
|
||||||
|
pcmd rm "${destination_dir}.${c_marker}" || \
|
||||||
|
_exit_err "Removing ${destination_dir}/${c_marker} failed."
|
||||||
|
fi
|
||||||
|
|
||||||
|
_techo "Finished backup (rsync return code: $ret)."
|
||||||
|
if [ "${ret}" -ne 0 ]; then
|
||||||
|
_techo "Warning: rsync exited non-zero, the backup may be broken (see rsync errors)."
|
||||||
|
fi
|
||||||
|
|
||||||
|
#
|
||||||
|
# post_exec
|
||||||
|
#
|
||||||
|
if [ -x "${c_post_exec}" ]; then
|
||||||
|
_techo "Executing ${c_post_exec} ..."
|
||||||
|
"${c_post_exec}"; ret=$?
|
||||||
|
_techo "Finished ${c_post_exec}."
|
||||||
|
|
||||||
|
if [ ${ret} -ne 0 ]; then
|
||||||
|
_exit_err "${c_post_exec} failed."
|
||||||
|
fi
|
||||||
|
fi
|
||||||
|
|
||||||
|
# Calculation
|
||||||
|
end_s=$(date +%s)
|
||||||
|
|
||||||
|
full_seconds=$((${end_s} - ${begin_s}))
|
||||||
|
hours=$((${full_seconds} / 3600))
|
||||||
|
seconds=$((${full_seconds} - (${hours} * 3600)))
|
||||||
|
minutes=$((${seconds} / 60))
|
||||||
|
seconds=$((${seconds} - (${minutes} * 60)))
|
||||||
|
|
||||||
|
_techo "Backup lasted: ${hours}:${minutes}:${seconds} (h:m:s)"
|
||||||
|
|
||||||
|
) | add_name
|
||||||
|
done
|
||||||
|
|
||||||
|
#
|
||||||
|
# Be a good parent and wait for our children, if they are running wild parallel
|
||||||
|
#
|
||||||
|
if [ "${PARALLEL}" ]; then
|
||||||
|
_techo "Waiting for children to complete..."
|
||||||
|
wait
|
||||||
|
fi
|
||||||
|
|
||||||
|
#
|
||||||
|
# Look for post-exec command (general)
|
||||||
|
#
|
||||||
|
if [ -x "${CPOSTEXEC}" ]; then
|
||||||
|
_techo "Executing ${CPOSTEXEC} ..."
|
||||||
|
"${CPOSTEXEC}"; ret=$?
|
||||||
|
_techo "Finished ${CPOSTEXEC} (return code: ${ret})."
|
||||||
|
|
||||||
|
if [ ${ret} -ne 0 ]; then
|
||||||
|
_techo "${CPOSTEXEC} failed."
|
||||||
|
fi
|
||||||
|
fi
|
||||||
|
|
||||||
|
rm -f "${TMP}"
|
||||||
|
_techo "Finished ${WE}"
|
||||||
|
|
||||||
|
# vim: set shiftwidth=3 tabstop=3 expandtab :
|
||||||
663
contrib/jlawless-2009-06-03/ccollect-i.sh
Executable file
@@ -0,0 +1,663 @@
|
||||||
|
#!/bin/sh
|
||||||
|
#
|
||||||
|
# 2005-2009 Nico Schottelius (nico-ccollect at schottelius.org)
|
||||||
|
#
|
||||||
|
# This file is part of ccollect.
|
||||||
|
#
|
||||||
|
# ccollect is free software: you can redistribute it and/or modify
|
||||||
|
# it under the terms of the GNU General Public License as published by
|
||||||
|
# the Free Software Foundation, either version 3 of the License, or
|
||||||
|
# (at your option) any later version.
|
||||||
|
#
|
||||||
|
# ccollect is distributed in the hope that it will be useful,
|
||||||
|
# but WITHOUT ANY WARRANTY; without even the implied warranty of
|
||||||
|
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
|
||||||
|
# GNU General Public License for more details.
|
||||||
|
#
|
||||||
|
# You should have received a copy of the GNU General Public License
|
||||||
|
# along with ccollect. If not, see <http://www.gnu.org/licenses/>.
|
||||||
|
#
|
||||||
|
# Initially written for SyGroup (www.sygroup.ch)
|
||||||
|
# Date: Mon Nov 14 11:45:11 CET 2005
|
||||||
|
|
||||||
|
#
|
||||||
|
# Standard variables (stolen from cconf)
|
||||||
|
#
|
||||||
|
__pwd="$(pwd -P)"
|
||||||
|
__mydir="${0%/*}"; __abs_mydir="$(cd "$__mydir" && pwd -P)"
|
||||||
|
__myname=${0##*/}; __abs_myname="$__abs_mydir/$__myname"
|
||||||
|
|
||||||
|
#
|
||||||
|
# where to find our configuration and temporary file
|
||||||
|
#
|
||||||
|
CCOLLECT_CONF=${CCOLLECT_CONF:-/etc/ccollect}
|
||||||
|
CSOURCES=${CCOLLECT_CONF}/sources
|
||||||
|
CDEFAULTS=${CCOLLECT_CONF}/defaults
|
||||||
|
CPREEXEC="${CDEFAULTS}/pre_exec"
|
||||||
|
CPOSTEXEC="${CDEFAULTS}/post_exec"
|
||||||
|
|
||||||
|
TMP=$(mktemp "/tmp/${__myname}.XXXXXX")
|
||||||
|
VERSION=0.7.1
|
||||||
|
RELEASE="2009-02-02"
|
||||||
|
HALF_VERSION="ccollect ${VERSION}"
|
||||||
|
FULL_VERSION="ccollect ${VERSION} (${RELEASE})"
|
||||||
|
|
||||||
|
#TSORT="tc" ; NEWER="cnewer"
|
||||||
|
TSORT="t" ; NEWER="newer"
|
||||||
|
|
||||||
|
#
|
||||||
|
# CDATE: how we use it for naming of the archives
|
||||||
|
# DDATE: how the user should see it in our output (DISPLAY)
|
||||||
|
#
|
||||||
|
CDATE="date +%Y%m%d-%H%M"
|
||||||
|
DDATE="date +%Y-%m-%d-%H:%M:%S"
|
||||||
|
|
||||||
|
#
|
||||||
|
# unset parallel execution
|
||||||
|
#
|
||||||
|
PARALLEL=""
|
||||||
|
|
||||||
|
#
|
||||||
|
# catch signals
|
||||||
|
#
|
||||||
|
trap "rm -f \"${TMP}\"" 1 2 15
|
||||||
|
|
||||||
|
#
|
||||||
|
# Functions
|
||||||
|
#
|
||||||
|
|
||||||
|
# time displaying echo
|
||||||
|
_techo()
|
||||||
|
{
|
||||||
|
echo "$(${DDATE}): $@"
|
||||||
|
}
|
||||||
|
|
||||||
|
# exit on error
|
||||||
|
_exit_err()
|
||||||
|
{
|
||||||
|
_techo "$@"
|
||||||
|
rm -f "${TMP}"
|
||||||
|
exit 1
|
||||||
|
}
|
||||||
|
|
||||||
|
add_name()
|
||||||
|
{
|
||||||
|
awk "{ print \"[${name}] \" \$0 }"
|
||||||
|
}
|
||||||
|
|
||||||
|
pcmd()
|
||||||
|
{
|
||||||
|
if [ "$remote_host" ]; then
|
||||||
|
ssh "$remote_host" "$@"
|
||||||
|
else
|
||||||
|
"$@"
|
||||||
|
fi
|
||||||
|
}
|
||||||
|
|
||||||
|
#
|
||||||
|
# Version
|
||||||
|
#
|
||||||
|
display_version()
|
||||||
|
{
|
||||||
|
echo "${FULL_VERSION}"
|
||||||
|
exit 0
|
||||||
|
}
|
||||||
|
|
||||||
|
#
|
||||||
|
# Tell how to use us
|
||||||
|
#
|
||||||
|
usage()
|
||||||
|
{
|
||||||
|
echo "${__myname}: <interval name> [args] <sources to backup>"
|
||||||
|
echo ""
|
||||||
|
echo " ccollect creates (pseudo) incremental backups"
|
||||||
|
echo ""
|
||||||
|
echo " -h, --help: Show this help screen"
|
||||||
|
echo " -p, --parallel: Parallelise backup processes"
|
||||||
|
echo " -a, --all: Backup all sources specified in ${CSOURCES}"
|
||||||
|
echo " -v, --verbose: Be very verbose (uses set -x)"
|
||||||
|
echo " -V, --version: Print version information"
|
||||||
|
echo ""
|
||||||
|
echo " This is version ${VERSION}, released on ${RELEASE}"
|
||||||
|
echo " (the first version was written on 2005-12-05 by Nico Schottelius)."
|
||||||
|
echo ""
|
||||||
|
echo " Retrieve latest ccollect at http://unix.schottelius.org/ccollect/"
|
||||||
|
exit 0
|
||||||
|
}
|
||||||
|
|
||||||
|
#
|
||||||
|
# Select interval if AUTO
|
||||||
|
#
|
||||||
|
# For this to work nicely, you have to choose interval names that sort nicely
|
||||||
|
# such as int1, int2, int3 or a_daily, b_weekly, c_monthly, etc.
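#
# An illustrative layout (hypothetical paths, using the default /etc/ccollect
# configuration directory and the interval names from the accompanying README):
#
#    $ ls -1 /etc/ccollect/defaults/intervals
#    a_daily
#    b_weekly
#    c_monthly
#
# The alphabetical order then doubles as "most frequent first", which is what
# the selection loop below relies on.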
|
||||||
|
#
|
||||||
|
auto_interval()
|
||||||
|
{
|
||||||
|
if [ -d "${backup}/intervals" -a -n "$(ls "${backup}/intervals" 2>/dev/null)" ] ; then
|
||||||
|
intervals_dir="${backup}/intervals"
|
||||||
|
elif [ -d "${CDEFAULTS}/intervals" -a -n "$(ls "${CDEFAULTS}/intervals" 2>/dev/null)" ] ; then
|
||||||
|
intervals_dir="${CDEFAULTS}/intervals"
|
||||||
|
else
|
||||||
|
_exit_err "No intervals are defined. Skipping."
|
||||||
|
fi
|
||||||
|
echo intervals_dir=${intervals_dir}
|
||||||
|
|
||||||
|
trial_interval="$(ls -1r "${intervals_dir}/" | head -n 1)" || \
|
||||||
|
_exit_err "Failed to list contents of ${intervals_dir}/."
|
||||||
|
_techo "Considering interval ${trial_interval}"
|
||||||
|
most_recent="$(pcmd ls -${TSORT}p1 "${ddir}" | grep "^${trial_interval}.*/$" | head -n 1)" || \
|
||||||
|
_exit_err "Failed to list contents of ${ddir}/."
|
||||||
|
_techo " Most recent ${trial_interval}: '${most_recent}'"
|
||||||
|
if [ -n "${most_recent}" ]; then
|
||||||
|
no_intervals="$(ls -1 "${intervals_dir}/" | wc -l)"
|
||||||
|
n=1
|
||||||
|
while [ "${n}" -le "${no_intervals}" ]; do
|
||||||
|
trial_interval="$(ls -p1 "${intervals_dir}/" | tail -n+${n} | head -n 1)"
|
||||||
|
_techo "Considering interval '${trial_interval}'"
|
||||||
|
c_interval="$(cat "${intervals_dir}/${trial_interval}" 2>/dev/null)"
|
||||||
|
m=$((${n}+1))
|
||||||
|
set -- "${ddir}" -maxdepth 1
|
||||||
|
while [ "${m}" -le "${no_intervals}" ]; do
|
||||||
|
interval_m="$(ls -1 "${intervals_dir}/" | tail -n+${m} | head -n 1)"
|
||||||
|
most_recent="$(pcmd ls -${TSORT}p1 "${ddir}" | grep "^${interval_m}\..*/$" | head -n 1)"
|
||||||
|
_techo " Most recent ${interval_m}: '${most_recent}'"
|
||||||
|
if [ -n "${most_recent}" ] ; then
|
||||||
|
set -- "$@" -$NEWER "${ddir}/${most_recent}"
|
||||||
|
fi
|
||||||
|
m=$((${m}+1))
|
||||||
|
done
|
||||||
|
count=$(pcmd find "$@" -iname "${trial_interval}*" | wc -l)
|
||||||
|
_techo " Found $count more recent backups of ${trial_interval} (limit: ${c_interval})"
|
||||||
|
if [ "$count" -lt "${c_interval}" ] ; then
|
||||||
|
break
|
||||||
|
fi
|
||||||
|
n=$((${n}+1))
|
||||||
|
done
|
||||||
|
fi
|
||||||
|
export INTERVAL="${trial_interval}"
|
||||||
|
D_FILE_INTERVAL="${intervals_dir}/${INTERVAL}"
|
||||||
|
D_INTERVAL=$(cat "${D_FILE_INTERVAL}" 2>/dev/null)
|
||||||
|
}
|
||||||
|
|
||||||
|
#
|
||||||
|
# need at least interval and one source or --all
|
||||||
|
#
|
||||||
|
if [ $# -lt 2 ]; then
|
||||||
|
if [ "$1" = "-V" -o "$1" = "--version" ]; then
|
||||||
|
display_version
|
||||||
|
else
|
||||||
|
usage
|
||||||
|
fi
|
||||||
|
fi
|
||||||
|
|
||||||
|
#
|
||||||
|
# check for configuraton directory
|
||||||
|
#
|
||||||
|
[ -d "${CCOLLECT_CONF}" ] || _exit_err "No configuration found in " \
|
||||||
|
"\"${CCOLLECT_CONF}\" (is \$CCOLLECT_CONF properly set?)"
|
||||||
|
|
||||||
|
#
|
||||||
|
# Filter arguments
|
||||||
|
#
|
||||||
|
export INTERVAL="$1"; shift
|
||||||
|
i=1
|
||||||
|
no_sources=0
|
||||||
|
|
||||||
|
#
|
||||||
|
# Create source "array"
|
||||||
|
#
|
||||||
|
while [ "$#" -ge 1 ]; do
|
||||||
|
eval arg=\"\$1\"; shift
|
||||||
|
|
||||||
|
if [ "${NO_MORE_ARGS}" = 1 ]; then
|
||||||
|
eval source_${no_sources}=\"${arg}\"
|
||||||
|
no_sources=$((${no_sources}+1))
|
||||||
|
|
||||||
|
# make variable available for subscripts
|
||||||
|
eval export source_${no_sources}
|
||||||
|
else
|
||||||
|
case "${arg}" in
|
||||||
|
-a|--all)
|
||||||
|
ALL=1
|
||||||
|
;;
|
||||||
|
-v|--verbose)
|
||||||
|
VERBOSE=1
|
||||||
|
;;
|
||||||
|
-p|--parallel)
|
||||||
|
PARALLEL=1
|
||||||
|
;;
|
||||||
|
-h|--help)
|
||||||
|
usage
|
||||||
|
;;
|
||||||
|
--)
|
||||||
|
NO_MORE_ARGS=1
|
||||||
|
;;
|
||||||
|
*)
|
||||||
|
eval source_${no_sources}=\"$arg\"
|
||||||
|
no_sources=$(($no_sources+1))
|
||||||
|
;;
|
||||||
|
esac
|
||||||
|
fi
|
||||||
|
|
||||||
|
i=$(($i+1))
|
||||||
|
done
|
||||||
|
|
||||||
|
# also export number of sources
|
||||||
|
export no_sources
|
||||||
|
|
||||||
|
#
|
||||||
|
# be really, really, really verbose
|
||||||
|
#
|
||||||
|
if [ "${VERBOSE}" = 1 ]; then
|
||||||
|
set -x
|
||||||
|
fi
|
||||||
|
|
||||||
|
#
|
||||||
|
# Look, if we should take ALL sources
|
||||||
|
#
|
||||||
|
if [ "${ALL}" = 1 ]; then
|
||||||
|
# reset everything specified before
|
||||||
|
no_sources=0
|
||||||
|
|
||||||
|
#
|
||||||
|
# get entries from sources
|
||||||
|
#
|
||||||
|
cwd=$(pwd -P)
|
||||||
|
( cd "${CSOURCES}" && ls > "${TMP}" ); ret=$?
|
||||||
|
|
||||||
|
[ "${ret}" -eq 0 ] || _exit_err "Listing of sources failed. Aborting."
|
||||||
|
|
||||||
|
while read tmp; do
|
||||||
|
eval source_${no_sources}=\"${tmp}\"
|
||||||
|
no_sources=$((${no_sources}+1))
|
||||||
|
done < "${TMP}"
|
||||||
|
fi
|
||||||
|
|
||||||
|
#
|
||||||
|
# Need at least ONE source to backup
|
||||||
|
#
|
||||||
|
if [ "${no_sources}" -lt 1 ]; then
|
||||||
|
usage
|
||||||
|
else
|
||||||
|
_techo "${HALF_VERSION}: Beginning backup using interval ${INTERVAL}"
|
||||||
|
fi
|
||||||
|
|
||||||
|
#
|
||||||
|
# Look for pre-exec command (general)
|
||||||
|
#
|
||||||
|
if [ -x "${CPREEXEC}" ]; then
|
||||||
|
_techo "Executing ${CPREEXEC} ..."
|
||||||
|
"${CPREEXEC}"; ret=$?
|
||||||
|
_techo "Finished ${CPREEXEC} (return code: ${ret})."
|
||||||
|
|
||||||
|
[ "${ret}" -eq 0 ] || _exit_err "${CPREEXEC} failed. Aborting"
|
||||||
|
fi
|
||||||
|
|
||||||
|
#
|
||||||
|
# check default configuration
|
||||||
|
#
|
||||||
|
|
||||||
|
D_FILE_INTERVAL="${CDEFAULTS}/intervals/${INTERVAL}"
|
||||||
|
D_INTERVAL=$(cat "${D_FILE_INTERVAL}" 2>/dev/null)
|
||||||
|
|
||||||
|
|
||||||
|
#
|
||||||
|
# Let's do the backup
|
||||||
|
#
|
||||||
|
i=0
|
||||||
|
while [ "${i}" -lt "${no_sources}" ]; do
|
||||||
|
|
||||||
|
#
|
||||||
|
# Get current source
|
||||||
|
#
|
||||||
|
eval name=\"\$source_${i}\"
|
||||||
|
i=$((${i}+1))
|
||||||
|
|
||||||
|
export name
|
||||||
|
|
||||||
|
#
|
||||||
|
# start ourself, if we want parallel execution
|
||||||
|
#
|
||||||
|
if [ "${PARALLEL}" ]; then
|
||||||
|
"$0" "${INTERVAL}" "${name}" &
|
||||||
|
continue
|
||||||
|
fi
|
||||||
|
|
||||||
|
#
|
||||||
|
# Start subshell for easy log editing
|
||||||
|
#
|
||||||
|
(
|
||||||
|
#
|
||||||
|
# Stderr to stdout, so we can produce nice logs
|
||||||
|
#
|
||||||
|
exec 2>&1
|
||||||
|
|
||||||
|
#
|
||||||
|
# Configuration
|
||||||
|
#
|
||||||
|
backup="${CSOURCES}/${name}"
|
||||||
|
c_source="${backup}/source"
|
||||||
|
c_dest="${backup}/destination"
|
||||||
|
c_pre_exec="${backup}/pre_exec"
|
||||||
|
c_post_exec="${backup}/post_exec"
|
||||||
|
for opt in exclude verbose very_verbose rsync_options summary delete_incomplete remote_host ; do
|
||||||
|
if [ -f "${backup}/$opt" -o -f "${backup}/no_$opt" ]; then
|
||||||
|
eval c_$opt=\"${backup}/$opt\"
|
||||||
|
else
|
||||||
|
eval c_$opt=\"${CDEFAULTS}/$opt\"
|
||||||
|
fi
|
||||||
|
done
|
||||||
|
|
||||||
|
#
|
||||||
|
# Marking backups: If we abort it's not removed => Backup is broken
|
||||||
|
#
|
||||||
|
c_marker=".ccollect-marker"
|
||||||
|
|
||||||
|
#
|
||||||
|
# Times
|
||||||
|
#
|
||||||
|
begin_s=$(date +%s)
|
||||||
|
|
||||||
|
#
|
||||||
|
# unset possible options
|
||||||
|
#
|
||||||
|
VERBOSE=""
|
||||||
|
VVERBOSE=""
|
||||||
|
|
||||||
|
_techo "Beginning to backup"
|
||||||
|
|
||||||
|
#
|
||||||
|
# Standard configuration checks
|
||||||
|
#
|
||||||
|
if [ ! -e "${backup}" ]; then
|
||||||
|
_exit_err "Source does not exist."
|
||||||
|
fi
|
||||||
|
|
||||||
|
#
|
||||||
|
# configuration _must_ be a directory
|
||||||
|
#
|
||||||
|
if [ ! -d "${backup}" ]; then
|
||||||
|
_exit_err "\"${name}\" is not a cconfig-directory. Skipping."
|
||||||
|
fi
|
||||||
|
|
||||||
|
#
|
||||||
|
# first execute pre_exec, which may generate destination or other
|
||||||
|
# parameters
|
||||||
|
#
|
||||||
|
if [ -x "${c_pre_exec}" ]; then
|
||||||
|
_techo "Executing ${c_pre_exec} ..."
|
||||||
|
"${c_pre_exec}"; ret="$?"
|
||||||
|
_techo "Finished ${c_pre_exec} (return code ${ret})."
|
||||||
|
|
||||||
|
if [ "${ret}" -ne 0 ]; then
|
||||||
|
_exit_err "${c_pre_exec} failed. Skipping."
|
||||||
|
fi
|
||||||
|
fi
|
||||||
|
|
||||||
|
#
|
||||||
|
# Destination is a path
|
||||||
|
#
|
||||||
|
if [ ! -f "${c_dest}" ]; then
|
||||||
|
_exit_err "Destination ${c_dest} is not a file. Skipping."
|
||||||
|
else
|
||||||
|
ddir=$(cat "${c_dest}"); ret="$?"
|
||||||
|
if [ "${ret}" -ne 0 ]; then
|
||||||
|
_exit_err "Destination ${c_dest} is not readable. Skipping."
|
||||||
|
fi
|
||||||
|
fi
|
||||||
|
|
||||||
|
#
|
||||||
|
# interval definition: First try source specific, fallback to default
|
||||||
|
#
|
||||||
|
if [ "${INTERVAL}" = "AUTO" ] ; then
|
||||||
|
auto_interval
|
||||||
|
_techo "Selected interval: '$INTERVAL'"
|
||||||
|
fi
|
||||||
|
c_interval="$(cat "${backup}/intervals/${INTERVAL}" 2>/dev/null)"
|
||||||
|
|
||||||
|
if [ -z "${c_interval}" ]; then
|
||||||
|
c_interval="${D_INTERVAL}"
|
||||||
|
|
||||||
|
if [ -z "${c_interval}" ]; then
|
||||||
|
_exit_err "No definition for interval \"${INTERVAL}\" found. Skipping."
|
||||||
|
fi
|
||||||
|
fi
|
||||||
|
|
||||||
|
#
|
||||||
|
# Source checks
|
||||||
|
#
|
||||||
|
if [ ! -f "${c_source}" ]; then
|
||||||
|
_exit_err "Source description \"${c_source}\" is not a file. Skipping."
|
||||||
|
else
|
||||||
|
source=$(cat "${c_source}"); ret="$?"
|
||||||
|
if [ "${ret}" -ne 0 ]; then
|
||||||
|
_exit_err "Source ${c_source} is not readable. Skipping."
|
||||||
|
fi
|
||||||
|
fi
|
||||||
|
# Verify source is up and accepting connections before deleting any old backups
|
||||||
|
rsync "$source" >/dev/null || _exit_err "Source ${source} is not readable. Skipping."
|
||||||
|
|
||||||
|
#
|
||||||
|
# do we backup to a remote host? then set pre-cmd
|
||||||
|
#
|
||||||
|
if [ -f "${c_remote_host}" ]; then
|
||||||
|
# adjust ls and co
|
||||||
|
remote_host=$(cat "${c_remote_host}"); ret="$?"
|
||||||
|
if [ "${ret}" -ne 0 ]; then
|
||||||
|
_exit_err "Remote host file ${c_remote_host} exists, but is not readable. Skipping."
|
||||||
|
fi
|
||||||
|
destination="${remote_host}:${ddir}"
|
||||||
|
else
|
||||||
|
remote_host=""
|
||||||
|
destination="${ddir}"
|
||||||
|
fi
|
||||||
|
export remote_host
|
||||||
|
|
||||||
|
#
|
||||||
|
# check for existence / use real name
|
||||||
|
#
|
||||||
|
( pcmd cd "$ddir" ) || _exit_err "Cannot change to ${ddir}. Skipping."
|
||||||
|
|
||||||
|
|
||||||
|
# NEW method as of 0.6:
|
||||||
|
# - insert ccollect default parameters
|
||||||
|
# - insert options
|
||||||
|
# - insert user options
|
||||||
|
|
||||||
|
#
|
||||||
|
# rsync standard options
|
||||||
|
#
|
||||||
|
|
||||||
|
set -- "$@" "--archive" "--delete" "--numeric-ids" "--relative" \
|
||||||
|
"--delete-excluded" "--sparse"
|
||||||
|
|
||||||
|
#
|
||||||
|
# exclude list
|
||||||
|
#
|
||||||
|
if [ -f "${c_exclude}" ]; then
|
||||||
|
set -- "$@" "--exclude-from=${c_exclude}"
|
||||||
|
fi
|
||||||
|
|
||||||
|
#
|
||||||
|
# Output a summary
|
||||||
|
#
|
||||||
|
if [ -f "${c_summary}" ]; then
|
||||||
|
set -- "$@" "--stats"
|
||||||
|
fi
|
||||||
|
|
||||||
|
#
|
||||||
|
# Verbosity for rsync
|
||||||
|
#
|
||||||
|
if [ -f "${c_very_verbose}" ]; then
|
||||||
|
set -- "$@" "-vv"
|
||||||
|
elif [ -f "${c_verbose}" ]; then
|
||||||
|
set -- "$@" "-v"
|
||||||
|
fi
|
||||||
|
|
||||||
|
#
|
||||||
|
# extra options for rsync provided by the user
|
||||||
|
#
|
||||||
|
if [ -f "${c_rsync_options}" ]; then
|
||||||
|
while read line; do
|
||||||
|
set -- "$@" "$line"
|
||||||
|
done < "${c_rsync_options}"
|
||||||
|
fi
|
||||||
|
|
||||||
|
#
|
||||||
|
# Check for incomplete backups
|
||||||
|
#
|
||||||
|
pcmd ls -1 "$ddir/${INTERVAL}"*".${c_marker}" 2>/dev/null | while read marker; do
|
||||||
|
incomplete="$(echo ${marker} | sed "s/\\.${c_marker}\$//")"
|
||||||
|
_techo "Incomplete backup: ${incomplete}"
|
||||||
|
if [ -f "${c_delete_incomplete}" ]; then
|
||||||
|
_techo "Deleting ${incomplete} ..."
|
||||||
|
pcmd rm $VVERBOSE -rf "${incomplete}" || \
|
||||||
|
_exit_err "Removing ${incomplete} failed."
|
||||||
|
pcmd rm $VVERBOSE -f "${marker}" || \
|
||||||
|
_exit_err "Removing ${marker} failed."
|
||||||
|
fi
|
||||||
|
done
|
||||||
|
|
||||||
|
#
|
||||||
|
# check if maximum number of backups is reached, if so remove
|
||||||
|
# use grep and ls -p so we only look at directories
|
||||||
|
#
|
||||||
|
count="$(pcmd ls -p1 "${ddir}" | grep "^${INTERVAL}\..*/\$" | wc -l \
|
||||||
|
| sed 's/^ *//g')" || _exit_err "Counting backups failed"
|
||||||
|
|
||||||
|
_techo "Existing backups: ${count} Total keeping backups: ${c_interval}"
|
||||||
|
|
||||||
|
if [ "${count}" -ge "${c_interval}" ]; then
|
||||||
|
substract=$((${c_interval} - 1))
|
||||||
|
remove=$((${count} - ${substract}))
|
||||||
|
_techo "Removing ${remove} backup(s)..."
|
||||||
|
|
||||||
|
pcmd ls -${TSORT}p1r "$ddir" | grep "^${INTERVAL}\..*/\$" | \
|
||||||
|
head -n "${remove}" > "${TMP}" || \
|
||||||
|
_exit_err "Listing old backups failed"
|
||||||
|
|
||||||
|
i=0
|
||||||
|
while read to_remove; do
|
||||||
|
eval remove_$i=\"${to_remove}\"
|
||||||
|
i=$(($i+1))
|
||||||
|
done < "${TMP}"
|
||||||
|
|
||||||
|
j=0
|
||||||
|
while [ "$j" -lt "$i" ]; do
|
||||||
|
eval to_remove=\"\$remove_$j\"
|
||||||
|
_techo "Removing ${to_remove} ..."
|
||||||
|
pcmd rm ${VVERBOSE} -rf "${ddir}/${to_remove}" || \
|
||||||
|
_exit_err "Removing ${to_remove} failed."
|
||||||
|
j=$(($j+1))
|
||||||
|
done
|
||||||
|
fi
|
||||||
|
|
||||||
|
|
||||||
|
#
|
||||||
|
# Check for backup directory to clone from: Always clone from the latest one!
|
||||||
|
#
|
||||||
|
# Depending on your file system, you may want to sort on:
|
||||||
|
# 1. mtime (modification time) with TSORT=t, or
|
||||||
|
# 2. ctime (last change time, usually) with TSORT=tc
|
||||||
|
last_dir="$(pcmd ls -${TSORT}p1 "${ddir}" | grep '/$' | head -n 1)" || \
|
||||||
|
_exit_err "Failed to list contents of ${ddir}."
|
||||||
|
|
||||||
|
#
|
||||||
|
# clone from old backup, if existing
|
||||||
|
#
|
||||||
|
if [ "${last_dir}" ]; then
|
||||||
|
set -- "$@" "--link-dest=${ddir}/${last_dir}"
|
||||||
|
_techo "Hard linking from ${last_dir}"
|
||||||
|
fi
|
||||||
|
|
||||||
|
|
||||||
|
# set time when we really begin to backup, not when we began to remove above
|
||||||
|
destination_date=$(${CDATE})
|
||||||
|
destination_dir="${ddir}/${INTERVAL}.${destination_date}.$$"
|
||||||
|
destination_full="${destination}/${INTERVAL}.${destination_date}.$$"
|
||||||
|
|
||||||
|
# give some info
|
||||||
|
_techo "Beginning to backup, this may take some time..."
|
||||||
|
|
||||||
|
_techo "Creating ${destination_dir} ..."
|
||||||
|
pcmd mkdir ${VVERBOSE} "${destination_dir}" || \
|
||||||
|
_exit_err "Creating ${destination_dir} failed. Skipping."
|
||||||
|
|
||||||
|
#
|
||||||
|
# added marking in 0.6 (and remove it, if successful later)
|
||||||
|
#
|
||||||
|
pcmd touch "${destination_dir}.${c_marker}"
|
||||||
|
|
||||||
|
#
|
||||||
|
# the rsync part
|
||||||
|
#
|
||||||
|
_techo "Transferring files..."
|
||||||
|
rsync "$@" "${source}" "${destination_full}"; ret=$?
|
||||||
|
# Correct the modification time:
|
||||||
|
pcmd touch "${destination_dir}"
|
||||||
|
|
||||||
|
#
|
||||||
|
# remove marking here
|
||||||
|
#
|
||||||
|
if [ "$ret" -ne 12 ] ; then
|
||||||
|
pcmd rm "${destination_dir}.${c_marker}" || \
|
||||||
|
_exit_err "Removing ${destination_dir}/${c_marker} failed."
|
||||||
|
fi
|
||||||
|
|
||||||
|
_techo "Finished backup (rsync return code: $ret)."
|
||||||
|
if [ "${ret}" -ne 0 ]; then
|
||||||
|
_techo "Warning: rsync exited non-zero, the backup may be broken (see rsync errors)."
|
||||||
|
fi
|
||||||
|
|
||||||
|
#
|
||||||
|
# post_exec
|
||||||
|
#
|
||||||
|
if [ -x "${c_post_exec}" ]; then
|
||||||
|
_techo "Executing ${c_post_exec} ..."
|
||||||
|
"${c_post_exec}"; ret=$?
|
||||||
|
_techo "Finished ${c_post_exec}."
|
||||||
|
|
||||||
|
if [ ${ret} -ne 0 ]; then
|
||||||
|
_exit_err "${c_post_exec} failed."
|
||||||
|
fi
|
||||||
|
fi
|
||||||
|
|
||||||
|
# Calculation
|
||||||
|
end_s=$(date +%s)
|
||||||
|
|
||||||
|
full_seconds=$((${end_s} - ${begin_s}))
|
||||||
|
hours=$((${full_seconds} / 3600))
|
||||||
|
seconds=$((${full_seconds} - (${hours} * 3600)))
|
||||||
|
minutes=$((${seconds} / 60))
|
||||||
|
seconds=$((${seconds} - (${minutes} * 60)))
|
||||||
|
|
||||||
|
_techo "Backup lasted: ${hours}:${minutes}:${seconds} (h:m:s)"
|
||||||
|
|
||||||
|
) | add_name
|
||||||
|
done
|
||||||
|
|
||||||
|
#
|
||||||
|
# Be a good parent and wait for our children, if they are running wild parallel
|
||||||
|
#
|
||||||
|
if [ "${PARALLEL}" ]; then
|
||||||
|
_techo "Waiting for children to complete..."
|
||||||
|
wait
|
||||||
|
fi
|
||||||
|
|
||||||
|
#
|
||||||
|
# Look for post-exec command (general)
|
||||||
|
#
|
||||||
|
if [ -x "${CPOSTEXEC}" ]; then
|
||||||
|
_techo "Executing ${CPOSTEXEC} ..."
|
||||||
|
"${CPOSTEXEC}"; ret=$?
|
||||||
|
_techo "Finished ${CPOSTEXEC} (return code: ${ret})."
|
||||||
|
|
||||||
|
if [ ${ret} -ne 0 ]; then
|
||||||
|
_techo "${CPOSTEXEC} failed."
|
||||||
|
fi
|
||||||
|
fi
|
||||||
|
|
||||||
|
rm -f "${TMP}"
|
||||||
|
_techo "Finished ${WE}"
|
||||||
|
|
||||||
|
# vim: set shiftwidth=3 tabstop=3 expandtab :
|
||||||
74
contrib/jlawless-2009-06-03/g.patch
Normal file
@@ -0,0 +1,74 @@
|
||||||
|
# I found that ccollect was not deleting incomplete backups despite the
|
||||||
|
# delete_incomplete option being specified. I traced the problem to:
|
||||||
|
#
|
||||||
|
# < pcmd rm $VVERBOSE -rf "${ddir}/${realincomplete}" || \
|
||||||
|
#
|
||||||
|
# which, at least on all the systems I tested, should read:
|
||||||
|
#
|
||||||
|
# > pcmd rm $VVERBOSE -rf "${realincomplete}" || \
|
||||||
|
#
|
||||||
|
# Also, the marker file is not deleted. I didn't see any reason to keep
|
||||||
|
# those files around (what do you think?), so I deleted them also:
|
||||||
|
#
|
||||||
|
# > pcmd rm $VVERBOSE -rf "${ddir}/${realincomplete}" || \
|
||||||
|
# > _exit_err "Removing ${realincomplete} failed."
|
||||||
|
#
|
||||||
|
# As long as I was messing with the delete-incomplete code and therefore needed
|
||||||
|
# to test it, I took the liberty of simplifying it. The v0.7.1 code uses
|
||||||
|
# multiple loops with multiple loop counters and creates many variables. I
|
||||||
|
# simplified that to a single loop:
|
||||||
|
#
|
||||||
|
# > pcmd ls -1 "$ddir/${INTERVAL}"*".${c_marker}" 2>/dev/null | while read marker; do
|
||||||
|
# > incomplete="$(echo ${marker} | sed "s/\\.${c_marker}\$//")"
|
||||||
|
# > _techo "Incomplete backup: ${incomplete}"
|
||||||
|
# > if [ "${DELETE_INCOMPLETE}" = "yes" ]; then
|
||||||
|
# > _techo "Deleting ${incomplete} ..."
|
||||||
|
# > pcmd rm $VVERBOSE -rf "${incomplete}" || \
|
||||||
|
# > _exit_err "Removing ${incomplete} failed."
|
||||||
|
# > pcmd rm $VVERBOSE -f "${marker}" || \
|
||||||
|
# > _exit_err "Removing ${marker} failed."
|
||||||
|
# > fi
|
||||||
|
# > done
|
||||||
|
#
|
||||||
|
# The final code (a) fixes the delete bug, (b) also deletes the marker, and
|
||||||
|
# (c) is eight lines shorter than the original.
|
||||||
|
#
|
||||||
|
--- ccollect-f.sh 2009-05-12 12:49:28.000000000 -0700
|
||||||
|
+++ ccollect-g.sh 2009-06-03 14:32:03.000000000 -0700
|
||||||
|
@@ -516,28 +516,20 @@
|
||||||
|
fi
|
||||||
|
|
||||||
|
#
|
||||||
|
# Check for incomplete backups
|
||||||
|
#
|
||||||
|
- pcmd ls -1 "$ddir/${INTERVAL}"*".${c_marker}" > "${TMP}" 2>/dev/null
|
||||||
|
-
|
||||||
|
- i=0
|
||||||
|
- while read incomplete; do
|
||||||
|
- eval incomplete_$i=\"$(echo ${incomplete} | sed "s/\\.${c_marker}\$//")\"
|
||||||
|
- i=$(($i+1))
|
||||||
|
- done < "${TMP}"
|
||||||
|
-
|
||||||
|
- j=0
|
||||||
|
- while [ "$j" -lt "$i" ]; do
|
||||||
|
- eval realincomplete=\"\$incomplete_$j\"
|
||||||
|
- _techo "Incomplete backup: ${realincomplete}"
|
||||||
|
+ pcmd ls -1 "$ddir/${INTERVAL}"*".${c_marker}" 2>/dev/null | while read marker; do
|
||||||
|
+ incomplete="$(echo ${marker} | sed "s/\\.${c_marker}\$//")"
|
||||||
|
+ _techo "Incomplete backup: ${incomplete}"
|
||||||
|
if [ "${DELETE_INCOMPLETE}" = "yes" ]; then
|
||||||
|
- _techo "Deleting ${realincomplete} ..."
|
||||||
|
- pcmd rm $VVERBOSE -rf "${ddir}/${realincomplete}" || \
|
||||||
|
- _exit_err "Removing ${realincomplete} failed."
|
||||||
|
+ _techo "Deleting ${incomplete} ..."
|
||||||
|
+ pcmd rm $VVERBOSE -rf "${incomplete}" || \
|
||||||
|
+ _exit_err "Removing ${incomplete} failed."
|
||||||
|
+ pcmd rm $VVERBOSE -f "${marker}" || \
|
||||||
|
+ _exit_err "Removing ${marker} failed."
|
||||||
|
fi
|
||||||
|
- j=$(($j+1))
|
||||||
|
done
|
||||||
|
|
||||||
|
#
|
||||||
|
# check if maximum number of backups is reached, if so remove
|
||||||
|
# use grep and ls -p so we only look at directories
|
||||||
18
contrib/jlawless-2009-06-03/h.patch
Normal file
@@ -0,0 +1,18 @@
|
||||||
|
# A line in my f.patch was missing needed quotation marks.
|
||||||
|
# This fixes that.
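# (The practical difference: with the unquoted test, an empty or unset
# ${INTERVAL} expands to nothing, the shell sees "[ = AUTO ]" and the test
# aborts with a syntax error instead of simply evaluating to false.)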
|
||||||
|
#
|
||||||
|
--- ccollect-g.sh 2009-06-03 14:32:03.000000000 -0700
|
||||||
|
+++ ccollect-h.sh 2009-06-03 14:32:19.000000000 -0700
|
||||||
|
@@ -412,11 +412,11 @@
|
||||||
|
fi
|
||||||
|
|
||||||
|
#
|
||||||
|
# interval definition: First try source specific, fallback to default
|
||||||
|
#
|
||||||
|
- if [ ${INTERVAL} = "AUTO" ] ; then
|
||||||
|
+ if [ "${INTERVAL}" = "AUTO" ] ; then
|
||||||
|
auto_interval
|
||||||
|
_techo "Selected interval: '$INTERVAL'"
|
||||||
|
fi
|
||||||
|
c_interval="$(cat "${backup}/intervals/${INTERVAL}" 2>/dev/null)"
|
||||||
|
|
||||||
134
contrib/jlawless-2009-06-03/i.patch
Normal file
@@ -0,0 +1,134 @@
|
||||||
|
# I have many sources that use the same options so I put those
|
||||||
|
# options in the defaults directory. I found that ccollect was
|
||||||
|
# ignoring most of them. I thought that this was a bug so I wrote
|
||||||
|
# some code to correct this:
|
||||||
|
#
|
||||||
|
# > for opt in exclude verbose very_verbose rsync_options summary delete_incomplete remote_host ; do
|
||||||
|
# > if [ -f "${backup}/$opt" -o -f "${backup}/no_$opt" ]; then
|
||||||
|
# > eval c_$opt=\"${backup}/$opt\"
|
||||||
|
# > else
|
||||||
|
# > eval c_$opt=\"${CDEFAULTS}/$opt\"
|
||||||
|
# > fi
|
||||||
|
# > done
|
||||||
|
#
|
||||||
|
# This also adds a new feature: if some option, say verbose, is
|
||||||
|
# specified in the defaults directory, it can be turned off for
|
||||||
|
# particular sources by specifying no_verbose as a source option.
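#
# An example layout (illustrative paths, using the default /etc/ccollect
# configuration directory):
#
#     /etc/ccollect/defaults/verbose           - every source is verbose
#     /etc/ccollect/sources/host1/no_verbose   - ...except host1
#
# The loop only tests for the presence of either file, so a "no_" file in a
# source directory simply stops the fallback to the defaults directory.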
|
||||||
|
#
|
||||||
|
# A side effect of this approach is that it forces script variable
|
||||||
|
# names to be consistent with option file names. Thus, there are
|
||||||
|
# several changes such as:
|
||||||
|
#
|
||||||
|
# < if [ -f "${c_rsync_extra}" ]; then
|
||||||
|
# > if [ -f "${c_rsync_options}" ]; then
|
||||||
|
#
|
||||||
|
# and
|
||||||
|
#
|
||||||
|
# < if [ -f "${c_vverbose}" ]; then
|
||||||
|
# > if [ -f "${c_very_verbose}" ]; then
|
||||||
|
#
|
||||||
|
# After correcting the bug and adding the "no_" feature, the code is
|
||||||
|
# 12 lines shorter.
|
||||||
|
#
|
||||||
|
--- ccollect-h.sh 2009-06-01 15:59:11.000000000 -0700
|
||||||
|
+++ ccollect-i.sh 2009-06-03 14:27:58.000000000 -0700
|
||||||
|
@@ -336,20 +336,19 @@
|
||||||
|
# Configuration
|
||||||
|
#
|
||||||
|
backup="${CSOURCES}/${name}"
|
||||||
|
c_source="${backup}/source"
|
||||||
|
c_dest="${backup}/destination"
|
||||||
|
- c_exclude="${backup}/exclude"
|
||||||
|
- c_verbose="${backup}/verbose"
|
||||||
|
- c_vverbose="${backup}/very_verbose"
|
||||||
|
- c_rsync_extra="${backup}/rsync_options"
|
||||||
|
- c_summary="${backup}/summary"
|
||||||
|
c_pre_exec="${backup}/pre_exec"
|
||||||
|
c_post_exec="${backup}/post_exec"
|
||||||
|
- f_incomplete="delete_incomplete"
|
||||||
|
- c_incomplete="${backup}/${f_incomplete}"
|
||||||
|
- c_remote_host="${backup}/remote_host"
|
||||||
|
+ for opt in exclude verbose very_verbose rsync_options summary delete_incomplete remote_host ; do
|
||||||
|
+ if [ -f "${backup}/$opt" -o -f "${backup}/no_$opt" ]; then
|
||||||
|
+ eval c_$opt=\"${backup}/$opt\"
|
||||||
|
+ else
|
||||||
|
+ eval c_$opt=\"${CDEFAULTS}/$opt\"
|
||||||
|
+ fi
|
||||||
|
+ done
|
||||||
|
|
||||||
|
#
|
||||||
|
# Marking backups: If we abort it's not removed => Backup is broken
|
||||||
|
#
|
||||||
|
c_marker=".ccollect-marker"
|
||||||
|
@@ -360,16 +359,12 @@
|
||||||
|
begin_s=$(date +%s)
|
||||||
|
|
||||||
|
#
|
||||||
|
# unset possible options
|
||||||
|
#
|
||||||
|
- EXCLUDE=""
|
||||||
|
- RSYNC_EXTRA=""
|
||||||
|
- SUMMARY=""
|
||||||
|
VERBOSE=""
|
||||||
|
VVERBOSE=""
|
||||||
|
- DELETE_INCOMPLETE=""
|
||||||
|
|
||||||
|
_techo "Beginning to backup"
|
||||||
|
|
||||||
|
#
|
||||||
|
# Standard configuration checks
|
||||||
|
@@ -462,17 +457,10 @@
|
||||||
|
# check for existence / use real name
|
||||||
|
#
|
||||||
|
( pcmd cd "$ddir" ) || _exit_err "Cannot change to ${ddir}. Skipping."
|
||||||
|
|
||||||
|
|
||||||
|
- #
|
||||||
|
- # Check whether to delete incomplete backups
|
||||||
|
- #
|
||||||
|
- if [ -f "${c_incomplete}" -o -f "${CDEFAULTS}/${f_incomplete}" ]; then
|
||||||
|
- DELETE_INCOMPLETE="yes"
|
||||||
|
- fi
|
||||||
|
-
|
||||||
|
# NEW method as of 0.6:
|
||||||
|
# - insert ccollect default parameters
|
||||||
|
# - insert options
|
||||||
|
# - insert user options
|
||||||
|
|
||||||
|
@@ -498,32 +486,32 @@
|
||||||
|
fi
|
||||||
|
|
||||||
|
#
|
||||||
|
# Verbosity for rsync
|
||||||
|
#
|
||||||
|
- if [ -f "${c_vverbose}" ]; then
|
||||||
|
+ if [ -f "${c_very_verbose}" ]; then
|
||||||
|
set -- "$@" "-vv"
|
||||||
|
elif [ -f "${c_verbose}" ]; then
|
||||||
|
set -- "$@" "-v"
|
||||||
|
fi
|
||||||
|
|
||||||
|
#
|
||||||
|
# extra options for rsync provided by the user
|
||||||
|
#
|
||||||
|
- if [ -f "${c_rsync_extra}" ]; then
|
||||||
|
+ if [ -f "${c_rsync_options}" ]; then
|
||||||
|
while read line; do
|
||||||
|
set -- "$@" "$line"
|
||||||
|
- done < "${c_rsync_extra}"
|
||||||
|
+ done < "${c_rsync_options}"
|
||||||
|
fi
|
||||||
|
|
||||||
|
#
|
||||||
|
# Check for incomplete backups
|
||||||
|
#
|
||||||
|
pcmd ls -1 "$ddir/${INTERVAL}"*".${c_marker}" 2>/dev/null | while read marker; do
|
||||||
|
incomplete="$(echo ${marker} | sed "s/\\.${c_marker}\$//")"
|
||||||
|
_techo "Incomplete backup: ${incomplete}"
|
||||||
|
- if [ "${DELETE_INCOMPLETE}" = "yes" ]; then
|
||||||
|
+ if [ -f "${c_delete_incomplete}" ]; then
|
||||||
|
_techo "Deleting ${incomplete} ..."
|
||||||
|
pcmd rm $VVERBOSE -rf "${incomplete}" || \
|
||||||
|
_exit_err "Removing ${incomplete} failed."
|
||||||
|
pcmd rm $VVERBOSE -f "${marker}" || \
|
||||||
|
_exit_err "Removing ${marker} failed."
|
||||||
296
contrib/jlawless-2009-06-03/old/README_a-f.txt
Normal file
@@ -0,0 +1,296 @@
|
||||||
|
Dear Nico Schottelius,
|
||||||
|
|
||||||
|
I have started using ccollect and I very much like its design:
|
||||||
|
it is elegant and effective.
|
||||||
|
|
||||||
|
In the process of getting ccollect set up and running, I made
|
||||||
|
five changes, including one major new feature, that I hope you will
|
||||||
|
find useful.
|
||||||
|
|
||||||
|
First, I added the following before any old backup gets deleted:
|
||||||
|
|
||||||
|
> # Verify source is up and accepting connections before deleting any old backups
|
||||||
|
> rsync "$source" >/dev/null || _exit_err "Source ${source} is not readable. Skipping."
|
||||||
|
|
||||||
|
I think that this quick test is much better than, say, pinging
|
||||||
|
the source in a pre-exec script: this tests not only that the
|
||||||
|
source is up and connected to the net, it also verifies (1) that
|
||||||
|
ssh is up and accepting our key (if we are using ssh), and (2) that
|
||||||
|
the source directory is mounted (if it needs to be mounted) and
|
||||||
|
readable.
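(For illustration, with hypothetical source values: if the source file
contains "/home/", this check is just a local directory listing, but if it
contains "root@host1:/home/", rsync has to open the ssh connection and list
the remote directory, so a dead host or a rejected key aborts the run before
any old backup is deleted.)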
|
||||||
|
|
||||||
|
Second, I found ccollect's use of ctime problematic. After
|
||||||
|
copying an old backup over to my ccollect destination, I adjusted
|
||||||
|
mtime and atime where needed using touch, e.g.:
|
||||||
|
|
||||||
|
touch -d"28 Apr 2009 3:00" destination/daily.01
|
||||||
|
|
||||||
|
However, as far as I know, there is no way to correct a bad ctime.
|
||||||
|
I ran into this issue repeatedly while adjusting my backup
|
||||||
|
configuration. (For example, "cp -a" preserves mtime but not
|
||||||
|
ctime. Even worse, "cp -al old new" also changes ctime on old.)
|
||||||
|
|
||||||
|
Another potential problem with ctime is that it is file-system
|
||||||
|
dependent: I have read that Windows sets ctime to create-time, not
|
||||||
|
last change-time.
|
||||||
|
|
||||||
|
However, it is simple to give a new backup the correct mtime.
|
||||||
|
After the rsync step, I added the command:
|
||||||
|
|
||||||
|
553a616,617
|
||||||
|
> # Correct the modification time:
|
||||||
|
> pcmd touch "${destination_dir}"
|
||||||
|
|
||||||
|
Even if ccollect continues to use ctime for sorting, I see no
|
||||||
|
reason not to have the backup directory have the correct mtime.
|
||||||
|
|
||||||
|
To allow the rest of the code to use either ctime or mtime, I
|
||||||
|
added definitions:
|
||||||
|
|
||||||
|
44a45,47
|
||||||
|
> #TSORT="tc" ; NEWER="cnewer"
|
||||||
|
> TSORT="t" ; NEWER="newer"
|
||||||
|
|
||||||
|
(It would be better if this choice was user-configurable because
|
||||||
|
those with existing backup directories should continue to use ctime
|
||||||
|
until the mtimes of their directories are correct. The correction
|
||||||
|
would happen passively over time as new backups are created using the
|
||||||
|
above touch command and the old ones are deleted.)
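One way this could be made configurable, sketched here only as an idea and
assuming a new, hypothetical option file named "time_sort_ctime" in the
defaults directory:

   # hypothetical sketch: choose the sort key from an optional defaults
   # file instead of hard-coding it
   if [ -f "${CDEFAULTS}/time_sort_ctime" ]; then
      TSORT="tc"; NEWER="cnewer"
   else
      TSORT="t";  NEWER="newer"
   fi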
|
||||||
|
|
||||||
|
With these definitions, the proper link-dest directory can then be
|
||||||
|
found using this minor change (and comment update):
|
||||||
|
|
||||||
|
516,519c579,582
|
||||||
|
< # Use ls -1c instead of -1t, because last modification maybe the same on all
|
||||||
|
< # and metadate update (-c) is updated by rsync locally.
|
||||||
|
< #
|
||||||
|
< last_dir="$(pcmd ls -tcp1 "${ddir}" | grep '/$' | head -n 1)" || \
|
||||||
|
---
|
||||||
|
> # Depending on your file system, you may want to sort on:
|
||||||
|
> # 1. mtime (modification time) with TSORT=t, or
|
||||||
|
> # 2. ctime (last change time, usually) with TSORT=tc
|
||||||
|
> last_dir="$(pcmd ls -${TSORT}p1 "${ddir}" | grep '/$' | head -n 1)" || \
|
||||||
|
|
||||||
|
Thirdly, after I copied my old backups over to my ccollect
|
||||||
|
destination directory, I found that ccollect would delete a
|
||||||
|
recent backup, not an old backup! My problem was that, unknown to
|
||||||
|
me, the algorithm to find the oldest backup (for deletion) was
|
||||||
|
inconsistent with that used to find the newest (for link-dest). I
|
||||||
|
suggest that these two should be consistent. Because time-sorting
|
||||||
|
seemed more consistent with the ccollect documentation, I suggest:
|
||||||
|
|
||||||
|
492,493c555,556
|
||||||
|
< pcmd ls -p1 "$ddir" | grep "^${INTERVAL}\..*/\$" | \
|
||||||
|
< sort -n | head -n "${remove}" > "${TMP}" || \
|
||||||
|
---
|
||||||
|
> pcmd ls -${TSORT}p1r "$ddir" | grep "^${INTERVAL}\..*/\$" | \
|
||||||
|
> head -n "${remove}" > "${TMP}" || \
|
||||||
|
|
||||||
|
Fourthly, in my experience, rsync error code 12 means complete
|
||||||
|
failure, usually because the source refuses the ssh connection.
|
||||||
|
So, I left the marker in that case:
|
||||||
|
|
||||||
|
558,559c622,625
|
||||||
|
< pcmd rm "${destination_dir}.${c_marker}" || \
|
||||||
|
< _exit_err "Removing ${destination_dir}/${c_marker} failed."
|
||||||
|
---
|
||||||
|
> if [ "$ret" -ne 12 ] ; then
|
||||||
|
> pcmd rm "${destination_dir}.${c_marker}" || \
|
||||||
|
> _exit_err "Removing ${destination_dir}/${c_marker} failed."
|
||||||
|
> fi
|
||||||
|
|
||||||
|
(A better solution might allow a user-configurable list of error
|
||||||
|
codes that are treated the same as a fail.)
|
||||||
|
|
||||||
|
Fifth, because I was frustrated by the problems of having a
|
||||||
|
cron-job decide which interval to backup, I added a major new
|
||||||
|
feature: the modified ccollect can now automatically select an
|
||||||
|
interval to use for backup.
|
||||||
|
|
||||||
|
Cron-job controlled backup works well if all machines are up and
|
||||||
|
running all the time and nothing ever goes wrong. I have, however,
|
||||||
|
some machines that are occasionally turned off, or that are mobile
|
||||||
|
and only sometimes connected to the local net. For these machines, the
|
||||||
|
use of cron-jobs to select intervals can be a disaster.
|
||||||
|
|
||||||
|
There are several ways one could automatically choose an
|
||||||
|
appropriate interval. The method I show below has the advantage
|
||||||
|
that it works with existing ccollect configuration files. The only
|
||||||
|
requirement is that interval names be chosen to sort nicely (under
|
||||||
|
ls). For example, I currently use:
|
||||||
|
|
||||||
|
$ ls -1 intervals
|
||||||
|
a_daily
|
||||||
|
b_weekly
|
||||||
|
c_monthly
|
||||||
|
d_quarterly
|
||||||
|
e_yearly
|
||||||
|
$ cat intervals/*
|
||||||
|
6
|
||||||
|
3
|
||||||
|
2
|
||||||
|
3
|
||||||
|
30
|
||||||
|
|
||||||
|
A simpler example would be:
|
||||||
|
|
||||||
|
$ ls -1 intervals
|
||||||
|
int1
|
||||||
|
int2
|
||||||
|
int3
|
||||||
|
$ cat intervals/*
|
||||||
|
2
|
||||||
|
3
|
||||||
|
4
|
||||||
|
|
||||||
|
The algorithm works as follows:
|
||||||
|
|
||||||
|
If no backup exists for the least frequent interval (int3 in the
|
||||||
|
simpler example), then use that interval. Otherwise, use the
|
||||||
|
most frequent interval (int1) unless there are "$(cat
|
||||||
|
intervals/int1)" int1 backups more recent than any int2 or int3
|
||||||
|
backup, in which case select int2 unless there are "$(cat
|
||||||
|
intervals/int2)" int2 backups more recent than any int3 backups
|
||||||
|
in which case choose int3.
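To spell out the simpler example (int1=2, int2=3, int3=4): on a fresh
destination there is no int3 backup yet, so int3 is selected first. After
that, int1 is used until two int1 backups exist that are newer than every
int2 and int3 backup; the next run then picks int2, and so on, so the
intervals rotate by themselves without any calendar logic.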
|
||||||
|
|
||||||
|
This algorithm works well cycling through all the backups for my
|
||||||
|
always connected machines as well as for my usually connected
|
||||||
|
machines, and rarely connected machines. (For a rarely connected
|
||||||
|
machine, interval names like "b_weekly" lose their English meaning
|
||||||
|
but it still does a reasonable job of rotating through the
|
||||||
|
intervals.)
|
||||||
|
|
||||||
|
In addition to being more robust, the automatic interval
|
||||||
|
selection means that crontab is greatly simplified: only one line
|
||||||
|
is needed. I use:
|
||||||
|
|
||||||
|
30 3 * * * ccollect.sh AUTO host1 host2 host3 | tee -a /var/log/ccollect-full.log | ccollect_analyse_logs.sh iwe
|
||||||
|
|
||||||
|
Some users might prefer a calendar-driven algorithm such as: do
|
||||||
|
a yearly backup the first time a machine is connected during a new
|
||||||
|
year; do a monthly backup the first time that a machine is connected
|
||||||
|
during a month; etc. This, however, would require a change to the
|
||||||
|
ccollect configuration files. So, I didn't pursue the idea any
|
||||||
|
further.
|
||||||
|
|
||||||
|
The code checks to see if the user specified the interval as
|
||||||
|
AUTO. If so, the auto_interval function is called to select the
|
||||||
|
interval:
|
||||||
|
|
||||||
|
347a417,420
|
||||||
|
> if [ ${INTERVAL} = "AUTO" ] ; then
|
||||||
|
> auto_interval
|
||||||
|
> _techo "Selected interval: '$INTERVAL'"
|
||||||
|
> fi
|
||||||
|
|
||||||
|
The code for auto_interval is as follows (note that it allows 'more
|
||||||
|
recent' to be defined by either ctime or mtime as per the TSORT
|
||||||
|
variable):
|
||||||
|
|
||||||
|
125a129,182
|
||||||
|
> # Select interval if AUTO
|
||||||
|
> #
|
||||||
|
> # For this to work nicely, you have to choose interval names that sort nicely
|
||||||
|
> # such as int1, int2, int3 or a_daily, b_weekly, c_monthly, etc.
|
||||||
|
> #
|
||||||
|
> auto_interval()
|
||||||
|
> {
|
||||||
|
> if [ -d "${backup}/intervals" -a -n "$(ls "${backup}/intervals" 2>/dev/null)" ] ; then
|
||||||
|
> intervals_dir="${backup}/intervals"
|
||||||
|
> elif [ -d "${CDEFAULTS}/intervals" -a -n "$(ls "${CDEFAULTS}/intervals" 2>/dev/null)" ] ; then
|
||||||
|
> intervals_dir="${CDEFAULTS}/intervals"
|
||||||
|
> else
|
||||||
|
> _exit_err "No intervals are defined. Skipping."
|
||||||
|
> fi
|
||||||
|
> echo intervals_dir=${intervals_dir}
|
||||||
|
>
|
||||||
|
> trial_interval="$(ls -1r "${intervals_dir}/" | head -n 1)" || \
|
||||||
|
> _exit_err "Failed to list contents of ${intervals_dir}/."
|
||||||
|
> _techo "Considering interval ${trial_interval}"
|
||||||
|
> most_recent="$(pcmd ls -${TSORT}p1 "${ddir}" | grep "^${trial_interval}.*/$" | head -n 1)" || \
|
||||||
|
> _exit_err "Failed to list contents of ${ddir}/."
|
||||||
|
> _techo " Most recent ${trial_interval}: '${most_recent}'"
|
||||||
|
> if [ -n "${most_recent}" ]; then
|
||||||
|
> no_intervals="$(ls -1 "${intervals_dir}/" | wc -l)"
|
||||||
|
> n=1
|
||||||
|
> while [ "${n}" -le "${no_intervals}" ]; do
|
||||||
|
> trial_interval="$(ls -p1 "${intervals_dir}/" | tail -n+${n} | head -n 1)"
|
||||||
|
> _techo "Considering interval '${trial_interval}'"
|
||||||
|
> c_interval="$(cat "${intervals_dir}/${trial_interval}" 2>/dev/null)"
|
||||||
|
> m=$((${n}+1))
|
||||||
|
> set -- "${ddir}" -maxdepth 1
|
||||||
|
> while [ "${m}" -le "${no_intervals}" ]; do
|
||||||
|
> interval_m="$(ls -1 "${intervals_dir}/" | tail -n+${m} | head -n 1)"
|
||||||
|
> most_recent="$(pcmd ls -${TSORT}p1 "${ddir}" | grep "^${interval_m}\..*/$" | head -n 1)"
|
||||||
|
> _techo " Most recent ${interval_m}: '${most_recent}'"
|
||||||
|
> if [ -n "${most_recent}" ] ; then
|
||||||
|
> set -- "$@" -$NEWER "${ddir}/${most_recent}"
|
||||||
|
> fi
|
||||||
|
> m=$((${m}+1))
|
||||||
|
> done
|
||||||
|
> count=$(pcmd find "$@" -iname "${trial_interval}*" | wc -l)
|
||||||
|
> _techo " Found $count more recent backups of ${trial_interval} (limit: ${c_interval})"
|
||||||
|
> if [ "$count" -lt "${c_interval}" ] ; then
|
||||||
|
> break
|
||||||
|
> fi
|
||||||
|
> n=$((${n}+1))
|
||||||
|
> done
|
||||||
|
> fi
|
||||||
|
> export INTERVAL="${trial_interval}"
|
||||||
|
> D_FILE_INTERVAL="${intervals_dir}/${INTERVAL}"
|
||||||
|
> D_INTERVAL=$(cat "${D_FILE_INTERVAL}" 2>/dev/null)
|
||||||
|
> }
|
||||||
|
>
|
||||||
|
> #
|
||||||
|
|
||||||
|
While I consider the auto_interval code to be developmental, I have
|
||||||
|
been using it for my nightly backups and it works for me.
|
||||||
|
|
||||||
|
One last change: For auto_interval to work, it needs "ddir" to
|
||||||
|
be defined first. Consequently, I had to move the following code
|
||||||
|
so it gets run before auto_interval is called:
|
||||||
|
|
||||||
|
369,380c442,443
|
||||||
|
<
|
||||||
|
< #
|
||||||
|
< # Destination is a path
|
||||||
|
< #
|
||||||
|
< if [ ! -f "${c_dest}" ]; then
|
||||||
|
< _exit_err "Destination ${c_dest} is not a file. Skipping."
|
||||||
|
< else
|
||||||
|
< ddir=$(cat "${c_dest}"); ret="$?"
|
||||||
|
< if [ "${ret}" -ne 0 ]; then
|
||||||
|
< _exit_err "Destination ${c_dest} is not readable. Skipping."
|
||||||
|
< fi
|
||||||
|
< fi
|
||||||
|
345a403,414
|
||||||
|
> # Destination is a path
|
||||||
|
> #
|
||||||
|
> if [ ! -f "${c_dest}" ]; then
|
||||||
|
> _exit_err "Destination ${c_dest} is not a file. Skipping."
|
||||||
|
> else
|
||||||
|
> ddir=$(cat "${c_dest}"); ret="$?"
|
||||||
|
> if [ "${ret}" -ne 0 ]; then
|
||||||
|
> _exit_err "Destination ${c_dest} is not readable. Skipping."
|
||||||
|
> fi
|
||||||
|
> fi
|
||||||
|
>
|
||||||
|
> #
|
||||||
|
|
||||||
|
I have some other ideas but this is all I have implemented at
|
||||||
|
the moment. Files are attached.
|
||||||
|
|
||||||
|
Thanks again for developing ccollect and let me know what you
|
||||||
|
think.
|
||||||
|
|
||||||
|
Regards,
|
||||||
|
|
||||||
|
John
|
||||||
|
|
||||||
|
--
|
||||||
|
John L. Lawless, Ph.D.
|
||||||
|
Redwood Scientific, Inc.
|
||||||
|
1005 Terra Nova Blvd
|
||||||
|
Pacifica, CA 94044-4300
|
||||||
|
1-650-738-8083
|
||||||
|
|
||||||
15
contrib/jlawless-2009-06-03/old/a.patch
Normal file
@@ -0,0 +1,15 @@
|
||||||
|
--- ccollect-0.7.1.sh 2009-02-02 03:39:42.000000000 -0800
|
||||||
|
+++ ccollect-0.7.1-a.sh 2009-05-24 21:30:38.000000000 -0700
|
||||||
|
@@ -364,10 +364,12 @@
|
||||||
|
source=$(cat "${c_source}"); ret="$?"
|
||||||
|
if [ "${ret}" -ne 0 ]; then
|
||||||
|
_exit_err "Source ${c_source} is not readable. Skipping."
|
||||||
|
fi
|
||||||
|
fi
|
||||||
|
+ # Verify source is up and accepting connections before deleting any old backups
|
||||||
|
+ rsync "$source" >/dev/null || _exit_err "Source ${source} is not readable. Skipping."
|
||||||
|
|
||||||
|
#
|
||||||
|
# Destination is a path
|
||||||
|
#
|
||||||
|
if [ ! -f "${c_dest}" ]; then
|
||||||
15
contrib/jlawless-2009-06-03/old/b.patch
Normal file
@@ -0,0 +1,15 @@
|
||||||
|
--- ccollect-0.7.1-a.sh 2009-05-24 21:30:38.000000000 -0700
|
||||||
|
+++ ccollect-0.7.1-b.sh 2009-05-24 21:32:00.000000000 -0700
|
||||||
|
@@ -551,10 +551,12 @@
|
||||||
|
# the rsync part
|
||||||
|
#
|
||||||
|
|
||||||
|
_techo "Transferring files..."
|
||||||
|
rsync "$@" "${source}" "${destination_full}"; ret=$?
|
||||||
|
+ # Correct the modification time:
|
||||||
|
+ pcmd touch "${destination_dir}"
|
||||||
|
|
||||||
|
#
|
||||||
|
# remove marking here
|
||||||
|
#
|
||||||
|
pcmd rm "${destination_dir}.${c_marker}" || \
|
||||||
35
contrib/jlawless-2009-06-03/old/c.patch
Normal file
@@ -0,0 +1,35 @@
|
||||||
|
--- ccollect-0.7.1-b.sh 2009-05-24 21:32:00.000000000 -0700
|
||||||
|
+++ ccollect-0.7.1-c.sh 2009-05-24 21:39:43.000000000 -0700
|
||||||
|
@@ -40,10 +40,13 @@
|
||||||
|
VERSION=0.7.1
|
||||||
|
RELEASE="2009-02-02"
|
||||||
|
HALF_VERSION="ccollect ${VERSION}"
|
||||||
|
FULL_VERSION="ccollect ${VERSION} (${RELEASE})"
|
||||||
|
|
||||||
|
+#TSORT="tc" ; NEWER="cnewer"
|
||||||
|
+TSORT="t" ; NEWER="newer"
|
||||||
|
+
|
||||||
|
#
|
||||||
|
# CDATE: how we use it for naming of the archives
|
||||||
|
# DDATE: how the user should see it in our output (DISPLAY)
|
||||||
|
#
|
||||||
|
CDATE="date +%Y%m%d-%H%M"
|
||||||
|
@@ -513,14 +516,14 @@
|
||||||
|
|
||||||
|
|
||||||
|
#
|
||||||
|
# Check for backup directory to clone from: Always clone from the latest one!
|
||||||
|
#
|
||||||
|
- # Use ls -1c instead of -1t, because last modification maybe the same on all
|
||||||
|
- # and metadate update (-c) is updated by rsync locally.
|
||||||
|
- #
|
||||||
|
- last_dir="$(pcmd ls -tcp1 "${ddir}" | grep '/$' | head -n 1)" || \
|
||||||
|
+ # Depending on your file system, you may want to sort on:
|
||||||
|
+ # 1. mtime (modification time) with TSORT=t, or
|
||||||
|
+ # 2. ctime (last change time, usually) with TSORT=tc
|
||||||
|
+ last_dir="$(pcmd ls -${TSORT}p1 "${ddir}" | grep '/$' | head -n 1)" || \
|
||||||
|
_exit_err "Failed to list contents of ${ddir}."
|
||||||
|
|
||||||
|
#
|
||||||
|
# clone from old backup, if existing
|
||||||
|
#
|
||||||
|
|
@@ -1,6 +1,6 @@
|
||||||
#!/bin/sh
|
#!/bin/sh
|
||||||
#
|
#
|
||||||
# 2005-2008 Nico Schottelius (nico-ccollect at schottelius.org)
|
# 2005-2009 Nico Schottelius (nico-ccollect at schottelius.org)
|
||||||
#
|
#
|
||||||
# This file is part of ccollect.
|
# This file is part of ccollect.
|
||||||
#
|
#
|
||||||
|
|
@@ -20,6 +20,13 @@
|
||||||
# Initially written for SyGroup (www.sygroup.ch)
|
# Initially written for SyGroup (www.sygroup.ch)
|
||||||
# Date: Mon Nov 14 11:45:11 CET 2005
|
# Date: Mon Nov 14 11:45:11 CET 2005
|
||||||
|
|
||||||
|
#
|
||||||
|
# Standard variables (stolen from cconf)
|
||||||
|
#
|
||||||
|
__pwd="$(pwd -P)"
|
||||||
|
__mydir="${0%/*}"; __abs_mydir="$(cd "$__mydir" && pwd -P)"
|
||||||
|
__myname=${0##*/}; __abs_myname="$__abs_mydir/$__myname"
|
||||||
|
|
||||||
#
|
#
|
||||||
# where to find our configuration and temporary file
|
# where to find our configuration and temporary file
|
||||||
#
|
#
|
||||||
|
|
@@ -29,9 +36,9 @@ CDEFAULTS=${CCOLLECT_CONF}/defaults
|
||||||
CPREEXEC="${CDEFAULTS}/pre_exec"
|
CPREEXEC="${CDEFAULTS}/pre_exec"
|
||||||
CPOSTEXEC="${CDEFAULTS}/post_exec"
|
CPOSTEXEC="${CDEFAULTS}/post_exec"
|
||||||
|
|
||||||
TMP=$(mktemp "/tmp/$(basename $0).XXXXXX")
|
TMP=$(mktemp "/tmp/${__myname}.XXXXXX")
|
||||||
VERSION=0.7.0
|
VERSION=0.7.1
|
||||||
RELEASE="2008-03-17"
|
RELEASE="2009-02-02"
|
||||||
HALF_VERSION="ccollect ${VERSION}"
|
HALF_VERSION="ccollect ${VERSION}"
|
||||||
FULL_VERSION="ccollect ${VERSION} (${RELEASE})"
|
FULL_VERSION="ccollect ${VERSION} (${RELEASE})"
|
||||||
|
|
||||||
|
|
@@ -72,7 +79,7 @@ _exit_err()
|
||||||
|
|
||||||
add_name()
|
add_name()
|
||||||
{
|
{
|
||||||
sed "s:^:\[${name}\] :"
|
awk "{ print \"[${name}] \" \$0 }"
|
||||||
}
|
}
|
||||||
|
|
||||||
pcmd()
|
pcmd()
|
||||||
|
|
@@ -98,7 +105,7 @@ display_version()
|
||||||
#
|
#
|
||||||
usage()
|
usage()
|
||||||
{
|
{
|
||||||
echo "$(basename $0): <interval name> [args] <sources to backup>"
|
echo "${__myname}: <interval name> [args] <sources to backup>"
|
||||||
echo ""
|
echo ""
|
||||||
echo " ccollect creates (pseudo) incremental backups"
|
echo " ccollect creates (pseudo) incremental backups"
|
||||||
echo ""
|
echo ""
|
||||||
|
|
@@ -281,7 +288,8 @@ while [ "${i}" -lt "${no_sources}" ]; do
|
||||||
c_summary="${backup}/summary"
|
c_summary="${backup}/summary"
|
||||||
c_pre_exec="${backup}/pre_exec"
|
c_pre_exec="${backup}/pre_exec"
|
||||||
c_post_exec="${backup}/post_exec"
|
c_post_exec="${backup}/post_exec"
|
||||||
c_incomplete="${backup}/delete_incomplete"
|
f_incomplete="delete_incomplete"
|
||||||
|
c_incomplete="${backup}/${f_incomplete}"
|
||||||
c_remote_host="${backup}/remote_host"
|
c_remote_host="${backup}/remote_host"
|
||||||
|
|
||||||
#
|
#
|
||||||
|
|
@@ -390,13 +398,13 @@ while [ "${i}" -lt "${no_sources}" ]; do
|
||||||
#
|
#
|
||||||
# check for existence / use real name
|
# check for existence / use real name
|
||||||
#
|
#
|
||||||
pcmd cd "$ddir" || _exit_err "Cannot change to ${ddir}. Skipping."
|
( pcmd cd "$ddir" ) || _exit_err "Cannot change to ${ddir}. Skipping."
|
||||||
|
|
||||||
|
|
||||||
#
|
#
|
||||||
# Check whether to delete incomplete backups
|
# Check whether to delete incomplete backups
|
||||||
#
|
#
|
||||||
if [ -f "${c_incomplete}" ]; then
|
if [ -f "${c_incomplete}" -o -f "${CDEFAULTS}/${f_incomplete}" ]; then
|
||||||
DELETE_INCOMPLETE="yes"
|
DELETE_INCOMPLETE="yes"
|
||||||
fi
|
fi
|
||||||
|
|
||||||
|
|
@@ -464,7 +472,7 @@ while [ "${i}" -lt "${no_sources}" ]; do
|
||||||
pcmd rm $VVERBOSE -rf "${ddir}/${realincomplete}" || \
|
pcmd rm $VVERBOSE -rf "${ddir}/${realincomplete}" || \
|
||||||
_exit_err "Removing ${realincomplete} failed."
|
_exit_err "Removing ${realincomplete} failed."
|
||||||
fi
|
fi
|
||||||
j=$((j+1))
|
j=$(($j+1))
|
||||||
done
|
done
|
||||||
|
|
||||||
#
|
#
|
||||||
|
|
@@ -497,7 +505,7 @@ while [ "${i}" -lt "${no_sources}" ]; do
|
||||||
_techo "Removing ${to_remove} ..."
|
_techo "Removing ${to_remove} ..."
|
||||||
pcmd rm ${VVERBOSE} -rf "${ddir}/${to_remove}" || \
|
pcmd rm ${VVERBOSE} -rf "${ddir}/${to_remove}" || \
|
||||||
_exit_err "Removing ${to_remove} failed."
|
_exit_err "Removing ${to_remove} failed."
|
||||||
j=$((j+1))
|
j=$(($j+1))
|
||||||
done
|
done
|
||||||
fi
|
fi
|
||||||
|
|
||||||
683
contrib/jlawless-2009-06-03/old/ccollect-f.sh
Executable file
@@ -0,0 +1,683 @@
|
||||||
|
#!/bin/sh
|
||||||
|
#
|
||||||
|
# 2005-2009 Nico Schottelius (nico-ccollect at schottelius.org)
|
||||||
|
#
|
||||||
|
# This file is part of ccollect.
|
||||||
|
#
|
||||||
|
# ccollect is free software: you can redistribute it and/or modify
|
||||||
|
# it under the terms of the GNU General Public License as published by
|
||||||
|
# the Free Software Foundation, either version 3 of the License, or
|
||||||
|
# (at your option) any later version.
|
||||||
|
#
|
||||||
|
# ccollect is distributed in the hope that it will be useful,
|
||||||
|
# but WITHOUT ANY WARRANTY; without even the implied warranty of
|
||||||
|
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
|
||||||
|
# GNU General Public License for more details.
|
||||||
|
#
|
||||||
|
# You should have received a copy of the GNU General Public License
|
||||||
|
# along with ccollect. If not, see <http://www.gnu.org/licenses/>.
|
||||||
|
#
|
||||||
|
# Initially written for SyGroup (www.sygroup.ch)
|
||||||
|
# Date: Mon Nov 14 11:45:11 CET 2005
|
||||||
|
|
||||||
|
#
|
||||||
|
# Standard variables (stolen from cconf)
|
||||||
|
#
|
||||||
|
__pwd="$(pwd -P)"
|
||||||
|
__mydir="${0%/*}"; __abs_mydir="$(cd "$__mydir" && pwd -P)"
|
||||||
|
__myname=${0##*/}; __abs_myname="$__abs_mydir/$__myname"
|
||||||
|
|
||||||
|
#
|
||||||
|
# where to find our configuration and temporary file
|
||||||
|
#
|
||||||
|
CCOLLECT_CONF=${CCOLLECT_CONF:-/etc/ccollect}
|
||||||
|
CSOURCES=${CCOLLECT_CONF}/sources
|
||||||
|
CDEFAULTS=${CCOLLECT_CONF}/defaults
|
||||||
|
CPREEXEC="${CDEFAULTS}/pre_exec"
|
||||||
|
CPOSTEXEC="${CDEFAULTS}/post_exec"
|
||||||
|
|
||||||
|
TMP=$(mktemp "/tmp/${__myname}.XXXXXX")
|
||||||
|
VERSION=0.7.1
|
||||||
|
RELEASE="2009-02-02"
|
||||||
|
HALF_VERSION="ccollect ${VERSION}"
|
||||||
|
FULL_VERSION="ccollect ${VERSION} (${RELEASE})"
|
||||||
|
|
||||||
|
#TSORT="tc" ; NEWER="cnewer"
|
||||||
|
TSORT="t" ; NEWER="newer"
|
||||||
|
|
||||||
|
#
|
||||||
|
# CDATE: how we use it for naming of the archives
|
||||||
|
# DDATE: how the user should see it in our output (DISPLAY)
|
||||||
|
#
|
||||||
|
CDATE="date +%Y%m%d-%H%M"
|
||||||
|
DDATE="date +%Y-%m-%d-%H:%M:%S"
|
||||||
|
|
||||||
|
#
|
||||||
|
# unset parallel execution
|
||||||
|
#
|
||||||
|
PARALLEL=""
|
||||||
|
|
||||||
|
#
|
||||||
|
# catch signals
|
||||||
|
#
|
||||||
|
trap "rm -f \"${TMP}\"" 1 2 15
|
||||||
|
|
||||||
|
#
|
||||||
|
# Functions
|
||||||
|
#
|
||||||
|
|
||||||
|
# time displaying echo
|
||||||
|
_techo()
|
||||||
|
{
|
||||||
|
echo "$(${DDATE}): $@"
|
||||||
|
}
|
||||||
|
|
||||||
|
# exit on error
|
||||||
|
_exit_err()
|
||||||
|
{
|
||||||
|
_techo "$@"
|
||||||
|
rm -f "${TMP}"
|
||||||
|
exit 1
|
||||||
|
}
|
||||||
|
|
||||||
|
add_name()
|
||||||
|
{
|
||||||
|
awk "{ print \"[${name}] \" \$0 }"
|
||||||
|
}
|
||||||
|
|
||||||
|
pcmd()
|
||||||
|
{
|
||||||
|
if [ "$remote_host" ]; then
|
||||||
|
ssh "$remote_host" "$@"
|
||||||
|
else
|
||||||
|
"$@"
|
||||||
|
fi
|
||||||
|
}
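
# Usage sketch (added for illustration, not part of the original file; the
# host name is a placeholder): with remote_host set, pcmd runs the command on
# the backup target via ssh, otherwise it runs locally.
#
#   remote_host="backuphost"
#   pcmd ls -1 "/backups"      # runs: ssh backuphost ls -1 /backups
#   remote_host=""
#   pcmd ls -1 "/backups"      # runs locally: ls -1 /backups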
|
||||||
|
|
||||||
|
#
|
||||||
|
# Version
|
||||||
|
#
|
||||||
|
display_version()
|
||||||
|
{
|
||||||
|
echo "${FULL_VERSION}"
|
||||||
|
exit 0
|
||||||
|
}
|
||||||
|
|
||||||
|
#
|
||||||
|
# Tell how to use us
|
||||||
|
#
|
||||||
|
usage()
|
||||||
|
{
|
||||||
|
echo "${__myname}: <interval name> [args] <sources to backup>"
|
||||||
|
echo ""
|
||||||
|
echo " ccollect creates (pseudo) incremental backups"
|
||||||
|
echo ""
|
||||||
|
echo " -h, --help: Show this help screen"
|
||||||
|
echo " -p, --parallel: Parallelise backup processes"
|
||||||
|
echo " -a, --all: Backup all sources specified in ${CSOURCES}"
|
||||||
|
echo " -v, --verbose: Be very verbose (uses set -x)"
|
||||||
|
echo " -V, --version: Print version information"
|
||||||
|
echo ""
|
||||||
|
echo " This is version ${VERSION}, released on ${RELEASE}"
|
||||||
|
echo " (the first version was written on 2005-12-05 by Nico Schottelius)."
|
||||||
|
echo ""
|
||||||
|
echo " Retrieve latest ccollect at http://unix.schottelius.org/ccollect/"
|
||||||
|
exit 0
|
||||||
|
}
|
||||||
|
|
||||||
|
#
|
||||||
|
# Select interval if AUTO
|
||||||
|
#
|
||||||
|
# For this to work nicely, you have to choose interval names that sort nicely
|
||||||
|
# such as int1, int2, int3 or a_daily, b_weekly, c_monthly, etc.
|
||||||
|
#
|
||||||
|
auto_interval()
|
||||||
|
{
|
||||||
|
if [ -d "${backup}/intervals" -a -n "$(ls "${backup}/intervals" 2>/dev/null)" ] ; then
|
||||||
|
intervals_dir="${backup}/intervals"
|
||||||
|
elif [ -d "${CDEFAULTS}/intervals" -a -n "$(ls "${CDEFAULTS}/intervals" 2>/dev/null)" ] ; then
|
||||||
|
intervals_dir="${CDEFAULTS}/intervals"
|
||||||
|
else
|
||||||
|
_exit_err "No intervals are defined. Skipping."
|
||||||
|
fi
|
||||||
|
echo intervals_dir=${intervals_dir}
|
||||||
|
|
||||||
|
trial_interval="$(ls -1r "${intervals_dir}/" | head -n 1)" || \
|
||||||
|
_exit_err "Failed to list contents of ${intervals_dir}/."
|
||||||
|
_techo "Considering interval ${trial_interval}"
|
||||||
|
most_recent="$(pcmd ls -${TSORT}p1 "${ddir}" | grep "^${trial_interval}.*/$" | head -n 1)" || \
|
||||||
|
_exit_err "Failed to list contents of ${ddir}/."
|
||||||
|
_techo " Most recent ${trial_interval}: '${most_recent}'"
|
||||||
|
if [ -n "${most_recent}" ]; then
|
||||||
|
no_intervals="$(ls -1 "${intervals_dir}/" | wc -l)"
|
||||||
|
n=1
|
||||||
|
while [ "${n}" -le "${no_intervals}" ]; do
|
||||||
|
trial_interval="$(ls -p1 "${intervals_dir}/" | tail -n+${n} | head -n 1)"
|
||||||
|
_techo "Considering interval '${trial_interval}'"
|
||||||
|
c_interval="$(cat "${intervals_dir}/${trial_interval}" 2>/dev/null)"
|
||||||
|
m=$((${n}+1))
|
||||||
|
set -- "${ddir}" -maxdepth 1
|
||||||
|
while [ "${m}" -le "${no_intervals}" ]; do
|
||||||
|
interval_m="$(ls -1 "${intervals_dir}/" | tail -n+${m} | head -n 1)"
|
||||||
|
most_recent="$(pcmd ls -${TSORT}p1 "${ddir}" | grep "^${interval_m}\..*/$" | head -n 1)"
|
||||||
|
_techo " Most recent ${interval_m}: '${most_recent}'"
|
||||||
|
if [ -n "${most_recent}" ] ; then
|
||||||
|
set -- "$@" -$NEWER "${ddir}/${most_recent}"
|
||||||
|
fi
|
||||||
|
m=$((${m}+1))
|
||||||
|
done
|
||||||
|
count=$(pcmd find "$@" -iname "${trial_interval}*" | wc -l)
|
||||||
|
_techo " Found $count more recent backups of ${trial_interval} (limit: ${c_interval})"
|
||||||
|
if [ "$count" -lt "${c_interval}" ] ; then
|
||||||
|
break
|
||||||
|
fi
|
||||||
|
n=$((${n}+1))
|
||||||
|
done
|
||||||
|
fi
|
||||||
|
export INTERVAL="${trial_interval}"
|
||||||
|
D_FILE_INTERVAL="${intervals_dir}/${INTERVAL}"
|
||||||
|
D_INTERVAL=$(cat "${D_FILE_INTERVAL}" 2>/dev/null)
|
||||||
|
}
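
# Note added for illustration (not part of the original file): auto_interval
# is only called further below when the interval name "AUTO" is given; it then
# picks an interval whose backup count is still below its configured limit and
# exports it:
#
#   INTERVAL="AUTO"
#   auto_interval
#   _techo "Selected interval: '$INTERVAL'"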
|
||||||
|
|
||||||
|
#
|
||||||
|
# need at least interval and one source or --all
|
||||||
|
#
|
||||||
|
if [ $# -lt 2 ]; then
|
||||||
|
if [ "$1" = "-V" -o "$1" = "--version" ]; then
|
||||||
|
display_version
|
||||||
|
else
|
||||||
|
usage
|
||||||
|
fi
|
||||||
|
fi
|
||||||
|
|
||||||
|
#
|
||||||
|
# check for configuration directory
|
||||||
|
#
|
||||||
|
[ -d "${CCOLLECT_CONF}" ] || _exit_err "No configuration found in " \
|
||||||
|
"\"${CCOLLECT_CONF}\" (is \$CCOLLECT_CONF properly set?)"
|
||||||
|
|
||||||
|
#
|
||||||
|
# Filter arguments
|
||||||
|
#
|
||||||
|
export INTERVAL="$1"; shift
|
||||||
|
i=1
|
||||||
|
no_sources=0
|
||||||
|
|
||||||
|
#
|
||||||
|
# Create source "array"
|
||||||
|
#
|
||||||
|
while [ "$#" -ge 1 ]; do
|
||||||
|
eval arg=\"\$1\"; shift
|
||||||
|
|
||||||
|
if [ "${NO_MORE_ARGS}" = 1 ]; then
|
||||||
|
eval source_${no_sources}=\"${arg}\"
|
||||||
|
no_sources=$((${no_sources}+1))
|
||||||
|
|
||||||
|
# make variable available for subscripts
|
||||||
|
eval export source_${no_sources}
|
||||||
|
else
|
||||||
|
case "${arg}" in
|
||||||
|
-a|--all)
|
||||||
|
ALL=1
|
||||||
|
;;
|
||||||
|
-v|--verbose)
|
||||||
|
VERBOSE=1
|
||||||
|
;;
|
||||||
|
-p|--parallel)
|
||||||
|
PARALLEL=1
|
||||||
|
;;
|
||||||
|
-h|--help)
|
||||||
|
usage
|
||||||
|
;;
|
||||||
|
--)
|
||||||
|
NO_MORE_ARGS=1
|
||||||
|
;;
|
||||||
|
*)
|
||||||
|
eval source_${no_sources}=\"$arg\"
|
||||||
|
no_sources=$(($no_sources+1))
|
||||||
|
;;
|
||||||
|
esac
|
||||||
|
fi
|
||||||
|
|
||||||
|
i=$(($i+1))
|
||||||
|
done
|
||||||
|
|
||||||
|
# also export number of sources
|
||||||
|
export no_sources
|
||||||
|
|
||||||
|
#
|
||||||
|
# be really, really, really verbose
|
||||||
|
#
|
||||||
|
if [ "${VERBOSE}" = 1 ]; then
|
||||||
|
set -x
|
||||||
|
fi
|
||||||
|
|
||||||
|
#
|
||||||
|
# Look, if we should take ALL sources
|
||||||
|
#
|
||||||
|
if [ "${ALL}" = 1 ]; then
|
||||||
|
# reset everything specified before
|
||||||
|
no_sources=0
|
||||||
|
|
||||||
|
#
|
||||||
|
# get entries from sources
|
||||||
|
#
|
||||||
|
cwd=$(pwd -P)
|
||||||
|
( cd "${CSOURCES}" && ls > "${TMP}" ); ret=$?
|
||||||
|
|
||||||
|
[ "${ret}" -eq 0 ] || _exit_err "Listing of sources failed. Aborting."
|
||||||
|
|
||||||
|
while read tmp; do
|
||||||
|
eval source_${no_sources}=\"${tmp}\"
|
||||||
|
no_sources=$((${no_sources}+1))
|
||||||
|
done < "${TMP}"
|
||||||
|
fi
|
||||||
|
|
||||||
|
#
|
||||||
|
# Need at least ONE source to backup
|
||||||
|
#
|
||||||
|
if [ "${no_sources}" -lt 1 ]; then
|
||||||
|
usage
|
||||||
|
else
|
||||||
|
_techo "${HALF_VERSION}: Beginning backup using interval ${INTERVAL}"
|
||||||
|
fi
|
||||||
|
|
||||||
|
#
|
||||||
|
# Look for pre-exec command (general)
|
||||||
|
#
|
||||||
|
if [ -x "${CPREEXEC}" ]; then
|
||||||
|
_techo "Executing ${CPREEXEC} ..."
|
||||||
|
"${CPREEXEC}"; ret=$?
|
||||||
|
_techo "Finished ${CPREEXEC} (return code: ${ret})."
|
||||||
|
|
||||||
|
[ "${ret}" -eq 0 ] || _exit_err "${CPREEXEC} failed. Aborting"
|
||||||
|
fi
|
||||||
|
|
||||||
|
#
|
||||||
|
# check default configuration
|
||||||
|
#
|
||||||
|
|
||||||
|
D_FILE_INTERVAL="${CDEFAULTS}/intervals/${INTERVAL}"
|
||||||
|
D_INTERVAL=$(cat "${D_FILE_INTERVAL}" 2>/dev/null)
|
||||||
|
|
||||||
|
|
||||||
|
#
|
||||||
|
# Let's do the backup
|
||||||
|
#
|
||||||
|
i=0
|
||||||
|
while [ "${i}" -lt "${no_sources}" ]; do
|
||||||
|
|
||||||
|
#
|
||||||
|
# Get current source
|
||||||
|
#
|
||||||
|
eval name=\"\$source_${i}\"
|
||||||
|
i=$((${i}+1))
|
||||||
|
|
||||||
|
export name
|
||||||
|
|
||||||
|
#
|
||||||
|
# start ourself, if we want parallel execution
|
||||||
|
#
|
||||||
|
if [ "${PARALLEL}" ]; then
|
||||||
|
"$0" "${INTERVAL}" "${name}" &
|
||||||
|
continue
|
||||||
|
fi
|
||||||
|
|
||||||
|
#
|
||||||
|
# Start subshell for easy log editing
|
||||||
|
#
|
||||||
|
(
|
||||||
|
#
|
||||||
|
# Stderr to stdout, so we can produce nice logs
|
||||||
|
#
|
||||||
|
exec 2>&1
|
||||||
|
|
||||||
|
#
|
||||||
|
# Configuration
|
||||||
|
#
|
||||||
|
backup="${CSOURCES}/${name}"
|
||||||
|
c_source="${backup}/source"
|
||||||
|
c_dest="${backup}/destination"
|
||||||
|
c_exclude="${backup}/exclude"
|
||||||
|
c_verbose="${backup}/verbose"
|
||||||
|
c_vverbose="${backup}/very_verbose"
|
||||||
|
c_rsync_extra="${backup}/rsync_options"
|
||||||
|
c_summary="${backup}/summary"
|
||||||
|
c_pre_exec="${backup}/pre_exec"
|
||||||
|
c_post_exec="${backup}/post_exec"
|
||||||
|
f_incomplete="delete_incomplete"
|
||||||
|
c_incomplete="${backup}/${f_incomplete}"
|
||||||
|
c_remote_host="${backup}/remote_host"
|
||||||
|
|
||||||
|
#
|
||||||
|
# Marking backups: If we abort it's not removed => Backup is broken
|
||||||
|
#
|
||||||
|
c_marker=".ccollect-marker"
|
||||||
|
|
||||||
|
#
|
||||||
|
# Times
|
||||||
|
#
|
||||||
|
begin_s=$(date +%s)
|
||||||
|
|
||||||
|
#
|
||||||
|
# unset possible options
|
||||||
|
#
|
||||||
|
EXCLUDE=""
|
||||||
|
RSYNC_EXTRA=""
|
||||||
|
SUMMARY=""
|
||||||
|
VERBOSE=""
|
||||||
|
VVERBOSE=""
|
||||||
|
DELETE_INCOMPLETE=""
|
||||||
|
|
||||||
|
_techo "Beginning to backup"
|
||||||
|
|
||||||
|
#
|
||||||
|
# Standard configuration checks
|
||||||
|
#
|
||||||
|
if [ ! -e "${backup}" ]; then
|
||||||
|
_exit_err "Source does not exist."
|
||||||
|
fi
|
||||||
|
|
||||||
|
#
|
||||||
|
# configuration _must_ be a directory
|
||||||
|
#
|
||||||
|
if [ ! -d "${backup}" ]; then
|
||||||
|
_exit_err "\"${name}\" is not a cconfig-directory. Skipping."
|
||||||
|
fi
|
||||||
|
|
||||||
|
#
|
||||||
|
# first execute pre_exec, which may generate destination or other
|
||||||
|
# parameters
|
||||||
|
#
|
||||||
|
if [ -x "${c_pre_exec}" ]; then
|
||||||
|
_techo "Executing ${c_pre_exec} ..."
|
||||||
|
"${c_pre_exec}"; ret="$?"
|
||||||
|
_techo "Finished ${c_pre_exec} (return code ${ret})."
|
||||||
|
|
||||||
|
if [ "${ret}" -ne 0 ]; then
|
||||||
|
_exit_err "${c_pre_exec} failed. Skipping."
|
||||||
|
fi
|
||||||
|
fi
|
||||||
|
|
||||||
|
#
|
||||||
|
# Destination is a path
|
||||||
|
#
|
||||||
|
if [ ! -f "${c_dest}" ]; then
|
||||||
|
_exit_err "Destination ${c_dest} is not a file. Skipping."
|
||||||
|
else
|
||||||
|
ddir=$(cat "${c_dest}"); ret="$?"
|
||||||
|
if [ "${ret}" -ne 0 ]; then
|
||||||
|
_exit_err "Destination ${c_dest} is not readable. Skipping."
|
||||||
|
fi
|
||||||
|
fi
|
||||||
|
|
||||||
|
#
|
||||||
|
# interval definition: First try source specific, fallback to default
|
||||||
|
#
|
||||||
|
if [ ${INTERVAL} = "AUTO" ] ; then
|
||||||
|
auto_interval
|
||||||
|
_techo "Selected interval: '$INTERVAL'"
|
||||||
|
fi
|
||||||
|
c_interval="$(cat "${backup}/intervals/${INTERVAL}" 2>/dev/null)"
|
||||||
|
|
||||||
|
if [ -z "${c_interval}" ]; then
|
||||||
|
c_interval="${D_INTERVAL}"
|
||||||
|
|
||||||
|
if [ -z "${c_interval}" ]; then
|
||||||
|
_exit_err "No definition for interval \"${INTERVAL}\" found. Skipping."
|
||||||
|
fi
|
||||||
|
fi
|
||||||
|
|
||||||
|
#
|
||||||
|
# Source checks
|
||||||
|
#
|
||||||
|
if [ ! -f "${c_source}" ]; then
|
||||||
|
_exit_err "Source description \"${c_source}\" is not a file. Skipping."
|
||||||
|
else
|
||||||
|
source=$(cat "${c_source}"); ret="$?"
|
||||||
|
if [ "${ret}" -ne 0 ]; then
|
||||||
|
_exit_err "Source ${c_source} is not readable. Skipping."
|
||||||
|
fi
|
||||||
|
fi
|
||||||
|
# Verify source is up and accepting connections before deleting any old backups
|
||||||
|
rsync "$source" >/dev/null || _exit_err "Source ${source} is not readable. Skipping."
|
||||||
|
|
||||||
|
#
|
||||||
|
# do we backup to a remote host? then set pre-cmd
|
||||||
|
#
|
||||||
|
if [ -f "${c_remote_host}" ]; then
|
||||||
|
# adjust ls and co
|
||||||
|
remote_host=$(cat "${c_remote_host}"); ret="$?"
|
||||||
|
if [ "${ret}" -ne 0 ]; then
|
||||||
|
_exit_err "Remote host file ${c_remote_host} exists, but is not readable. Skipping."
|
||||||
|
fi
|
||||||
|
destination="${remote_host}:${ddir}"
|
||||||
|
else
|
||||||
|
remote_host=""
|
||||||
|
destination="${ddir}"
|
||||||
|
fi
|
||||||
|
export remote_host
|
||||||
|
|
||||||
|
#
|
||||||
|
# check for existence / use real name
|
||||||
|
#
|
||||||
|
( pcmd cd "$ddir" ) || _exit_err "Cannot change to ${ddir}. Skipping."
|
||||||
|
|
||||||
|
|
||||||
|
#
|
||||||
|
# Check whether to delete incomplete backups
|
||||||
|
#
|
||||||
|
if [ -f "${c_incomplete}" -o -f "${CDEFAULTS}/${f_incomplete}" ]; then
|
||||||
|
DELETE_INCOMPLETE="yes"
|
||||||
|
fi
|
||||||
|
|
||||||
|
# NEW method as of 0.6:
|
||||||
|
# - insert ccollect default parameters
|
||||||
|
# - insert options
|
||||||
|
# - insert user options
|
||||||
|
|
||||||
|
#
|
||||||
|
# rsync standard options
|
||||||
|
#
|
||||||
|
|
||||||
|
set -- "$@" "--archive" "--delete" "--numeric-ids" "--relative" \
|
||||||
|
"--delete-excluded" "--sparse"
|
||||||
|
|
||||||
|
#
|
||||||
|
# exclude list
|
||||||
|
#
|
||||||
|
if [ -f "${c_exclude}" ]; then
|
||||||
|
set -- "$@" "--exclude-from=${c_exclude}"
|
||||||
|
fi
|
||||||
|
|
||||||
|
#
|
||||||
|
# Output a summary
|
||||||
|
#
|
||||||
|
if [ -f "${c_summary}" ]; then
|
||||||
|
set -- "$@" "--stats"
|
||||||
|
fi
|
||||||
|
|
||||||
|
#
|
||||||
|
# Verbosity for rsync
|
||||||
|
#
|
||||||
|
if [ -f "${c_vverbose}" ]; then
|
||||||
|
set -- "$@" "-vv"
|
||||||
|
elif [ -f "${c_verbose}" ]; then
|
||||||
|
set -- "$@" "-v"
|
||||||
|
fi
|
||||||
|
|
||||||
|
#
|
||||||
|
# extra options for rsync provided by the user
|
||||||
|
#
|
||||||
|
if [ -f "${c_rsync_extra}" ]; then
|
||||||
|
while read line; do
|
||||||
|
set -- "$@" "$line"
|
||||||
|
done < "${c_rsync_extra}"
|
||||||
|
fi
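
# Illustrative sketch (added for clarity, not part of the original file; all
# paths are made up): with the options collected in "$@" above, the transfer
# further below ends up looking roughly like
#
#   rsync --archive --delete --numeric-ids --relative --delete-excluded \
#      --sparse --exclude-from=/etc/ccollect/sources/mysrc/exclude --stats -v \
#      --link-dest=/backups/daily.20090531-0100.987 \
#      /home/ /backups/daily.20090601-0100.1234/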
|
||||||
|
|
||||||
|
#
|
||||||
|
# Check for incomplete backups
|
||||||
|
#
|
||||||
|
pcmd ls -1 "$ddir/${INTERVAL}"*".${c_marker}" > "${TMP}" 2>/dev/null
|
||||||
|
|
||||||
|
i=0
|
||||||
|
while read incomplete; do
|
||||||
|
eval incomplete_$i=\"$(echo ${incomplete} | sed "s/\\.${c_marker}\$//")\"
|
||||||
|
i=$(($i+1))
|
||||||
|
done < "${TMP}"
|
||||||
|
|
||||||
|
j=0
|
||||||
|
while [ "$j" -lt "$i" ]; do
|
||||||
|
eval realincomplete=\"\$incomplete_$j\"
|
||||||
|
_techo "Incomplete backup: ${realincomplete}"
|
||||||
|
if [ "${DELETE_INCOMPLETE}" = "yes" ]; then
|
||||||
|
_techo "Deleting ${realincomplete} ..."
|
||||||
|
pcmd rm $VVERBOSE -rf "${ddir}/${realincomplete}" || \
|
||||||
|
_exit_err "Removing ${realincomplete} failed."
|
||||||
|
fi
|
||||||
|
j=$(($j+1))
|
||||||
|
done
|
||||||
|
|
||||||
|
#
|
||||||
|
# check if maximum number of backups is reached, if so remove
|
||||||
|
# use grep and ls -p so we only look at directories
|
||||||
|
#
|
||||||
|
count="$(pcmd ls -p1 "${ddir}" | grep "^${INTERVAL}\..*/\$" | wc -l \
|
||||||
|
| sed 's/^ *//g')" || _exit_err "Counting backups failed"
|
||||||
|
|
||||||
|
_techo "Existing backups: ${count} Total keeping backups: ${c_interval}"
|
||||||
|
|
||||||
|
if [ "${count}" -ge "${c_interval}" ]; then
|
||||||
|
substract=$((${c_interval} - 1))
|
||||||
|
remove=$((${count} - ${substract}))
|
||||||
|
_techo "Removing ${remove} backup(s)..."
|
||||||
|
|
||||||
|
pcmd ls -${TSORT}p1r "$ddir" | grep "^${INTERVAL}\..*/\$" | \
|
||||||
|
head -n "${remove}" > "${TMP}" || \
|
||||||
|
_exit_err "Listing old backups failed"
|
||||||
|
|
||||||
|
i=0
|
||||||
|
while read to_remove; do
|
||||||
|
eval remove_$i=\"${to_remove}\"
|
||||||
|
i=$(($i+1))
|
||||||
|
done < "${TMP}"
|
||||||
|
|
||||||
|
j=0
|
||||||
|
while [ "$j" -lt "$i" ]; do
|
||||||
|
eval to_remove=\"\$remove_$j\"
|
||||||
|
_techo "Removing ${to_remove} ..."
|
||||||
|
pcmd rm ${VVERBOSE} -rf "${ddir}/${to_remove}" || \
|
||||||
|
_exit_err "Removing ${to_remove} failed."
|
||||||
|
j=$(($j+1))
|
||||||
|
done
|
||||||
|
fi
|
||||||
|
|
||||||
|
|
||||||
|
#
|
||||||
|
# Check for backup directory to clone from: Always clone from the latest one!
|
||||||
|
#
|
||||||
|
# Depending on your file system, you may want to sort on:
|
||||||
|
# 1. mtime (modification time) with TSORT=t, or
|
||||||
|
# 2. ctime (last change time, usually) with TSORT=tc
|
||||||
|
last_dir="$(pcmd ls -${TSORT}p1 "${ddir}" | grep '/$' | head -n 1)" || \
|
||||||
|
_exit_err "Failed to list contents of ${ddir}."
|
||||||
|
|
||||||
|
#
|
||||||
|
# clone from old backup, if existing
|
||||||
|
#
|
||||||
|
if [ "${last_dir}" ]; then
|
||||||
|
set -- "$@" "--link-dest=${ddir}/${last_dir}"
|
||||||
|
_techo "Hard linking from ${last_dir}"
|
||||||
|
fi
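
# Note added for illustration (not part of the original file): --link-dest
# makes rsync hard link files that are unchanged since the last backup, so
# every interval directory looks complete while only changed files take up
# new space. Identical files share an inode across backups, e.g.:
#
#   ls -li "${ddir}"/daily.*/etc/hosts   # same inode number in each backup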
|
||||||
|
|
||||||
|
|
||||||
|
# set time when we really begin to backup, not when we began to remove above
|
||||||
|
destination_date=$(${CDATE})
|
||||||
|
destination_dir="${ddir}/${INTERVAL}.${destination_date}.$$"
|
||||||
|
destination_full="${destination}/${INTERVAL}.${destination_date}.$$"
|
||||||
|
|
||||||
|
# give some info
|
||||||
|
_techo "Beginning to backup, this may take some time..."
|
||||||
|
|
||||||
|
_techo "Creating ${destination_dir} ..."
|
||||||
|
pcmd mkdir ${VVERBOSE} "${destination_dir}" || \
|
||||||
|
_exit_err "Creating ${destination_dir} failed. Skipping."
|
||||||
|
|
||||||
|
#
|
||||||
|
# added marking in 0.6 (and remove it, if successful later)
|
||||||
|
#
|
||||||
|
pcmd touch "${destination_dir}.${c_marker}"
|
||||||
|
|
||||||
|
#
|
||||||
|
# the rsync part
|
||||||
|
#
|
||||||
|
_techo "Transferring files..."
|
||||||
|
rsync "$@" "${source}" "${destination_full}"; ret=$?
|
||||||
|
# Correct the modification time:
|
||||||
|
pcmd touch "${destination_dir}"
|
||||||
|
|
||||||
|
#
|
||||||
|
# remove marking here
|
||||||
|
#
|
||||||
|
if [ "$ret" -ne 12 ] ; then
|
||||||
|
pcmd rm "${destination_dir}.${c_marker}" || \
|
||||||
|
_exit_err "Removing ${destination_dir}/${c_marker} failed."
|
||||||
|
fi
|
||||||
|
|
||||||
|
_techo "Finished backup (rsync return code: $ret)."
|
||||||
|
if [ "${ret}" -ne 0 ]; then
|
||||||
|
_techo "Warning: rsync exited non-zero, the backup may be broken (see rsync errors)."
|
||||||
|
fi
|
||||||
|
|
||||||
|
#
|
||||||
|
# post_exec
|
||||||
|
#
|
||||||
|
if [ -x "${c_post_exec}" ]; then
|
||||||
|
_techo "Executing ${c_post_exec} ..."
|
||||||
|
"${c_post_exec}"; ret=$?
|
||||||
|
_techo "Finished ${c_post_exec}."
|
||||||
|
|
||||||
|
if [ ${ret} -ne 0 ]; then
|
||||||
|
_exit_err "${c_post_exec} failed."
|
||||||
|
fi
|
||||||
|
fi
|
||||||
|
|
||||||
|
# Calculation
|
||||||
|
end_s=$(date +%s)
|
||||||
|
|
||||||
|
full_seconds=$((${end_s} - ${begin_s}))
|
||||||
|
hours=$((${full_seconds} / 3600))
|
||||||
|
seconds=$((${full_seconds} - (${hours} * 3600)))
|
||||||
|
minutes=$((${seconds} / 60))
|
||||||
|
seconds=$((${seconds} - (${minutes} * 60)))
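
# Worked example (added for illustration, not part of the original file):
# full_seconds=3725 gives hours=1, seconds=125, minutes=2, seconds=5,
# reported below as "Backup lasted: 1:2:5 (h:m:s)".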
|
||||||
|
|
||||||
|
_techo "Backup lasted: ${hours}:${minutes}:${seconds} (h:m:s)"
|
||||||
|
|
||||||
|
) | add_name
|
||||||
|
done
|
||||||
|
|
||||||
|
#
|
||||||
|
# Be a good parent and wait for our children, if they are running wild parallel
|
||||||
|
#
|
||||||
|
if [ "${PARALLEL}" ]; then
|
||||||
|
_techo "Waiting for children to complete..."
|
||||||
|
wait
|
||||||
|
fi
|
||||||
|
|
||||||
|
#
|
||||||
|
# Look for post-exec command (general)
|
||||||
|
#
|
||||||
|
if [ -x "${CPOSTEXEC}" ]; then
|
||||||
|
_techo "Executing ${CPOSTEXEC} ..."
|
||||||
|
"${CPOSTEXEC}"; ret=$?
|
||||||
|
_techo "Finished ${CPOSTEXEC} (return code: ${ret})."
|
||||||
|
|
||||||
|
if [ ${ret} -ne 0 ]; then
|
||||||
|
_techo "${CPOSTEXEC} failed."
|
||||||
|
fi
|
||||||
|
fi
|
||||||
|
|
||||||
|
rm -f "${TMP}"
|
||||||
|
_techo "Finished ${WE}"
|
||||||
|
|
||||||
|
# vim: set shiftwidth=3 tabstop=3 expandtab :
|
||||||
17 contrib/jlawless-2009-06-03/old/d.patch Normal file
@ -0,0 +1,17 @@
|
||||||
|
--- ccollect-0.7.1-c.sh 2009-05-24 21:39:43.000000000 -0700
|
||||||
|
+++ ccollect-0.7.1-d.sh 2009-05-24 21:47:09.000000000 -0700
|
||||||
|
@@ -492,12 +492,12 @@
|
||||||
|
if [ "${count}" -ge "${c_interval}" ]; then
|
||||||
|
substract=$((${c_interval} - 1))
|
||||||
|
remove=$((${count} - ${substract}))
|
||||||
|
_techo "Removing ${remove} backup(s)..."
|
||||||
|
|
||||||
|
- pcmd ls -p1 "$ddir" | grep "^${INTERVAL}\..*/\$" | \
|
||||||
|
- sort -n | head -n "${remove}" > "${TMP}" || \
|
||||||
|
+ pcmd ls -${TSORT}p1r "$ddir" | grep "^${INTERVAL}\..*/\$" | \
|
||||||
|
+ head -n "${remove}" > "${TMP}" || \
|
||||||
|
_exit_err "Listing old backups failed"
|
||||||
|
|
||||||
|
i=0
|
||||||
|
while read to_remove; do
|
||||||
|
eval remove_$i=\"${to_remove}\"
|
||||||
19 contrib/jlawless-2009-06-03/old/e.patch Normal file
@ -0,0 +1,19 @@
|
||||||
|
--- ccollect-0.7.1-d.sh 2009-05-24 21:47:09.000000000 -0700
|
||||||
|
+++ ccollect-0.7.1-e.sh 2009-05-24 22:18:16.000000000 -0700
|
||||||
|
@@ -560,12 +560,14 @@
|
||||||
|
pcmd touch "${destination_dir}"
|
||||||
|
|
||||||
|
#
|
||||||
|
# remove marking here
|
||||||
|
#
|
||||||
|
- pcmd rm "${destination_dir}.${c_marker}" || \
|
||||||
|
- _exit_err "Removing ${destination_dir}/${c_marker} failed."
|
||||||
|
+ if [ "$ret" -ne 12 ] ; then
|
||||||
|
+ pcmd rm "${destination_dir}.${c_marker}" || \
|
||||||
|
+ _exit_err "Removing ${destination_dir}/${c_marker} failed."
|
||||||
|
+ fi
|
||||||
|
|
||||||
|
_techo "Finished backup (rsync return code: $ret)."
|
||||||
|
if [ "${ret}" -ne 0 ]; then
|
||||||
|
_techo "Warning: rsync exited non-zero, the backup may be broken (see rsync errors)."
|
||||||
|
fi
|
||||||
119 contrib/jlawless-2009-06-03/old/f.patch Normal file
@ -0,0 +1,119 @@
|
||||||
|
--- ccollect-0.7.1-e.sh 2009-05-24 22:18:16.000000000 -0700
|
||||||
|
+++ ccollect-0.7.1-f.sh 2009-05-24 22:19:50.000000000 -0700
|
||||||
|
@@ -124,10 +124,64 @@
|
||||||
|
echo " Retrieve latest ccollect at http://unix.schottelius.org/ccollect/"
|
||||||
|
exit 0
|
||||||
|
}
|
||||||
|
|
||||||
|
#
|
||||||
|
+# Select interval if AUTO
|
||||||
|
+#
|
||||||
|
+# For this to work nicely, you have to choose interval names that sort nicely
|
||||||
|
+# such as int1, int2, int3 or a_daily, b_weekly, c_monthly, etc.
|
||||||
|
+#
|
||||||
|
+auto_interval()
|
||||||
|
+{
|
||||||
|
+ if [ -d "${backup}/intervals" -a -n "$(ls "${backup}/intervals" 2>/dev/null)" ] ; then
|
||||||
|
+ intervals_dir="${backup}/intervals"
|
||||||
|
+ elif [ -d "${CDEFAULTS}/intervals" -a -n "$(ls "${CDEFAULTS}/intervals" 2>/dev/null)" ] ; then
|
||||||
|
+ intervals_dir="${CDEFAULTS}/intervals"
|
||||||
|
+ else
|
||||||
|
+ _exit_err "No intervals are defined. Skipping."
|
||||||
|
+ fi
|
||||||
|
+ echo intervals_dir=${intervals_dir}
|
||||||
|
+
|
||||||
|
+ trial_interval="$(ls -1r "${intervals_dir}/" | head -n 1)" || \
|
||||||
|
+ _exit_err "Failed to list contents of ${intervals_dir}/."
|
||||||
|
+ _techo "Considering interval ${trial_interval}"
|
||||||
|
+ most_recent="$(pcmd ls -${TSORT}p1 "${ddir}" | grep "^${trial_interval}.*/$" | head -n 1)" || \
|
||||||
|
+ _exit_err "Failed to list contents of ${ddir}/."
|
||||||
|
+ _techo " Most recent ${trial_interval}: '${most_recent}'"
|
||||||
|
+ if [ -n "${most_recent}" ]; then
|
||||||
|
+ no_intervals="$(ls -1 "${intervals_dir}/" | wc -l)"
|
||||||
|
+ n=1
|
||||||
|
+ while [ "${n}" -le "${no_intervals}" ]; do
|
||||||
|
+ trial_interval="$(ls -p1 "${intervals_dir}/" | tail -n+${n} | head -n 1)"
|
||||||
|
+ _techo "Considering interval '${trial_interval}'"
|
||||||
|
+ c_interval="$(cat "${intervals_dir}/${trial_interval}" 2>/dev/null)"
|
||||||
|
+ m=$((${n}+1))
|
||||||
|
+ set -- "${ddir}" -maxdepth 1
|
||||||
|
+ while [ "${m}" -le "${no_intervals}" ]; do
|
||||||
|
+ interval_m="$(ls -1 "${intervals_dir}/" | tail -n+${m} | head -n 1)"
|
||||||
|
+ most_recent="$(pcmd ls -${TSORT}p1 "${ddir}" | grep "^${interval_m}\..*/$" | head -n 1)"
|
||||||
|
+ _techo " Most recent ${interval_m}: '${most_recent}'"
|
||||||
|
+ if [ -n "${most_recent}" ] ; then
|
||||||
|
+ set -- "$@" -$NEWER "${ddir}/${most_recent}"
|
||||||
|
+ fi
|
||||||
|
+ m=$((${m}+1))
|
||||||
|
+ done
|
||||||
|
+ count=$(pcmd find "$@" -iname "${trial_interval}*" | wc -l)
|
||||||
|
+ _techo " Found $count more recent backups of ${trial_interval} (limit: ${c_interval})"
|
||||||
|
+ if [ "$count" -lt "${c_interval}" ] ; then
|
||||||
|
+ break
|
||||||
|
+ fi
|
||||||
|
+ n=$((${n}+1))
|
||||||
|
+ done
|
||||||
|
+ fi
|
||||||
|
+ export INTERVAL="${trial_interval}"
|
||||||
|
+ D_FILE_INTERVAL="${intervals_dir}/${INTERVAL}"
|
||||||
|
+ D_INTERVAL=$(cat "${D_FILE_INTERVAL}" 2>/dev/null)
|
||||||
|
+}
|
||||||
|
+
|
||||||
|
+#
|
||||||
|
# need at least interval and one source or --all
|
||||||
|
#
|
||||||
|
if [ $# -lt 2 ]; then
|
||||||
|
if [ "$1" = "-V" -o "$1" = "--version" ]; then
|
||||||
|
display_version
|
||||||
|
@@ -344,12 +398,28 @@
|
||||||
|
_exit_err "${c_pre_exec} failed. Skipping."
|
||||||
|
fi
|
||||||
|
fi
|
||||||
|
|
||||||
|
#
|
||||||
|
+ # Destination is a path
|
||||||
|
+ #
|
||||||
|
+ if [ ! -f "${c_dest}" ]; then
|
||||||
|
+ _exit_err "Destination ${c_dest} is not a file. Skipping."
|
||||||
|
+ else
|
||||||
|
+ ddir=$(cat "${c_dest}"); ret="$?"
|
||||||
|
+ if [ "${ret}" -ne 0 ]; then
|
||||||
|
+ _exit_err "Destination ${c_dest} is not readable. Skipping."
|
||||||
|
+ fi
|
||||||
|
+ fi
|
||||||
|
+
|
||||||
|
+ #
|
||||||
|
# interval definition: First try source specific, fallback to default
|
||||||
|
#
|
||||||
|
+ if [ ${INTERVAL} = "AUTO" ] ; then
|
||||||
|
+ auto_interval
|
||||||
|
+ _techo "Selected interval: '$INTERVAL'"
|
||||||
|
+ fi
|
||||||
|
c_interval="$(cat "${backup}/intervals/${INTERVAL}" 2>/dev/null)"
|
||||||
|
|
||||||
|
if [ -z "${c_interval}" ]; then
|
||||||
|
c_interval="${D_INTERVAL}"
|
||||||
|
|
||||||
|
@@ -371,22 +441,10 @@
|
||||||
|
fi
|
||||||
|
# Verify source is up and accepting connections before deleting any old backups
|
||||||
|
rsync "$source" >/dev/null || _exit_err "Source ${source} is not readable. Skipping."
|
||||||
|
|
||||||
|
#
|
||||||
|
- # Destination is a path
|
||||||
|
- #
|
||||||
|
- if [ ! -f "${c_dest}" ]; then
|
||||||
|
- _exit_err "Destination ${c_dest} is not a file. Skipping."
|
||||||
|
- else
|
||||||
|
- ddir=$(cat "${c_dest}"); ret="$?"
|
||||||
|
- if [ "${ret}" -ne 0 ]; then
|
||||||
|
- _exit_err "Destination ${c_dest} is not readable. Skipping."
|
||||||
|
- fi
|
||||||
|
- fi
|
||||||
|
-
|
||||||
|
- #
|
||||||
|
# do we backup to a remote host? then set pre-cmd
|
||||||
|
#
|
||||||
|
if [ -f "${c_remote_host}" ]; then
|
||||||
|
# adjust ls and co
|
||||||
|
remote_host=$(cat "${c_remote_host}"); ret="$?"
|
||||||
|
|
@ -0,0 +1,14 @@
|
||||||
|
31c31,41
|
||||||
|
< logdir="${LOGCONF}/destination"
|
||||||
|
---
|
||||||
|
> c_dest="${LOGCONF}/destination"
|
||||||
|
>
|
||||||
|
> if [ ! -f ${c_dest} ]; then
|
||||||
|
> _exit_err "Destination ${c_dest} is not a file. Skipping."
|
||||||
|
> else
|
||||||
|
> logdir=$(cat "${c_dest}"); ret="$?"
|
||||||
|
> if [ "${ret}" -ne 0 ]; then
|
||||||
|
> _exit_err "Destination ${c_dest} is not readable. Skipping."
|
||||||
|
> fi
|
||||||
|
> fi
|
||||||
|
>
|
||||||
|
|
@ -3,7 +3,7 @@
|
||||||
# 2007 Daniel Aubry
|
# 2007 Daniel Aubry
|
||||||
# 2008 Nico Schottelius (added minimal header)
|
# 2008 Nico Schottelius (added minimal header)
|
||||||
#
|
#
|
||||||
# Copying license unknown
|
# Copying license: GPL2-only
|
||||||
#
|
#
|
||||||
|
|
||||||
# TODO:
|
# TODO:
|
||||||
|
|
@ -16,8 +16,9 @@ then
|
||||||
# changes after license clearify
|
# changes after license clearify
|
||||||
# for dest in /etc/ccollect/sources/ -type f -name destination | while read line
|
# for dest in /etc/ccollect/sources/ -type f -name destination | while read line
|
||||||
|
|
||||||
find /etc/ccollect/sources/ -type l | while read line
|
find /etc/ccollect/sources/*/destination | while read line
|
||||||
d=$(basename $(readlink $line))
|
do
|
||||||
|
d=$(basename $(cat $line))
|
||||||
echo "====[Backup: $backupname]====" | tee -a /var/log/backup.log
|
echo "====[Backup: $backupname]====" | tee -a /var/log/backup.log
|
||||||
du -sh $line/* | tee -a /var/log/backup.log
|
du -sh $line/* | tee -a /var/log/backup.log
|
||||||
done
|
done
|
||||||
72 contrib/thorsten_start_ccollect/start_ccollect Normal file
@ -0,0 +1,72 @@
|
||||||
|
#!/bin/bash
|
||||||
|
|
||||||
|
# Backup folder
|
||||||
|
BACKUP_DIR="/mnt"
|
||||||
|
|
||||||
|
# ccollect_logwrapper script
|
||||||
|
CCOLLECT_LOGWRAPPER="./ccollect_logwrapper.sh"
|
||||||
|
|
||||||
|
# determine the last backup for the daily, weekly, monthly (and yearly) groups in the backup folder
|
||||||
|
DATE_DAILY=` ls $BACKUP_DIR | grep daily | sort -r | sed -e'2,$d' | cut -f 2 -d.`
|
||||||
|
DATE_WEEKLY=` ls $BACKUP_DIR | grep weekly | sort -r | sed -e'2,$d' | cut -f 2 -d.`
|
||||||
|
DATE_MONTHLY=`ls $BACKUP_DIR | grep monthly | sort -r | sed -e'2,$d' | cut -f 2 -d.`
|
||||||
|
DATE_YEARLY=` ls $BACKUP_DIR | grep yearly | sort -r | sed -e'2,$d' | cut -f 2 -d.`
|
||||||
|
|
||||||
|
# If the string is empty, fill it with an "old" date
|
||||||
|
if [ -z "$DATE_DAILY" ] ; then DATE_DAILY="20000101-0101" ; fi
|
||||||
|
if [ -z "$DATE_WEEKLY" ] ; then DATE_WEEKLY="20000101-0101" ; fi
|
||||||
|
if [ -z "$DATE_MONTHLY" ] ; then DATE_MONTHLY="20000101-0101" ; fi
|
||||||
|
if [ -z "$DATE_YEARLY" ] ; then DATE_YEARLY="20000101-0101" ; fi
|
||||||
|
|
||||||
|
echo current: $DATE_CUR
|
||||||
|
echo last daily: $DATE_DAILY
|
||||||
|
echo last weekly: $DATE_WEEKLY
|
||||||
|
echo last monthly: $DATE_MONTHLY
|
||||||
|
echo last yearly: $DATE_YEARLY
|
||||||
|
|
||||||
|
# Convert the date into a format date(1) understands
|
||||||
|
# Note: this needs bash - not possible with plain sh!
|
||||||
|
# Alternatively, convert with expr...
|
||||||
|
|
||||||
|
DATE_DAILY=${DATE_DAILY:0:4}-${DATE_DAILY:4:2}-${DATE_DAILY:6:2}" "${DATE_DAILY:9:2}:${DATE_DAILY:11:2}:00
|
||||||
|
DATE_WEEKLY=${DATE_WEEKLY:0:4}-${DATE_WEEKLY:4:2}-${DATE_WEEKLY:6:2}" "${DATE_WEEKLY:9:2}:${DATE_WEEKLY:11:2}:00
|
||||||
|
DATE_MONTHLY=${DATE_MONTHLY:0:4}-${DATE_MONTHLY:4:2}-${DATE_MONTHLY:6:2}" "${DATE_MONTHLY:9:2}:${DATE_MONTHLY:11:2}:00
|
||||||
|
DATE_YEARLY=${DATE_YEARLY:0:4}-${DATE_YEARLY:4:2}-${DATE_YEARLY:6:2}" "${DATE_YEARLY:9:2}:${DATE_YEARLY:11:2}:00
|
||||||
|
DATE_CUR=`date "+%Y-%m-%d %T"`
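
# Worked example (added for illustration, not part of the original file):
# DATE_DAILY="20090601-0130" becomes "2009-06-01 01:30:00", which date(1)
# accepts via --date for the comparisons below.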
|
||||||
|
|
||||||
|
# Run backups as needed
|
||||||
|
|
||||||
|
if [ `date --date "$DATE_YEARLY" +%Y` -ne `date --date "$DATE_CUR" +%Y` ]
|
||||||
|
then
|
||||||
|
|
||||||
|
# create yearly backup
|
||||||
|
echo yearly backup started
|
||||||
|
source $CCOLLECT_LOGWRAPPER -a yearly
|
||||||
|
|
||||||
|
elif [ `date --date "$DATE_MONTHLY" +%Y%m` -ne `date --date "$DATE_CUR" +%Y%m` ]
|
||||||
|
then
|
||||||
|
|
||||||
|
# create monthly backup
|
||||||
|
echo monthly backup started
|
||||||
|
source $CCOLLECT_LOGWRAPPER -a monthly
|
||||||
|
|
||||||
|
elif [ `date --date "$DATE_WEEKLY" +%Y%W` -ne `date --date "$DATE_CUR" +%Y%W` ]
|
||||||
|
then
|
||||||
|
|
||||||
|
# create weekly backup
|
||||||
|
echo weekly backup started
|
||||||
|
source $CCOLLECT_LOGWRAPPER -a weekly
|
||||||
|
|
||||||
|
elif [ `date --date "$DATE_DAILY" +%Y%j` -ne `date --date "$DATE_CUR" +%Y%j` ]
|
||||||
|
then
|
||||||
|
|
||||||
|
# create daily backup
|
||||||
|
echo daily backup started
|
||||||
|
source $CCOLLECT_LOGWRAPPER -a daily
|
||||||
|
|
||||||
|
else
|
||||||
|
|
||||||
|
# nothing to do
|
||||||
|
echo nothing to do
|
||||||
|
|
||||||
|
fi
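
# Usage sketch (added for illustration, not part of the original file): run
# this wrapper once a day, e.g. from cron; the path is a placeholder:
#
#   30 1 * * * /usr/local/bin/start_ccollect >> /var/log/backup.log 2>&1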
|
||||||
37 doc/HACKING Normal file
@ -0,0 +1,37 @@
|
||||||
|
Hello Hacker,
|
||||||
|
|
||||||
|
I really appreciate your interest in hacking this software, but
|
||||||
|
I am kind of critical when seeing patches. Thus I created this
|
||||||
|
file to give you some hints of my thinking quirks.
|
||||||
|
|
||||||
|
|
||||||
|
Submitting patches
|
||||||
|
------------------
|
||||||
|
Make my life easier, make your life easier, use a version control system (vcs).
|
||||||
|
For this software the preferred vcs is git. Clone the latest repo, create
|
||||||
|
a new local branch (git checkout -b <branchname>) and write down your ideas.
|
||||||
|
|
||||||
|
When you're done, push all your stuff out to some public repo and drop a
|
||||||
|
mail to the mailing list describing what you did and where to get it.
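
A minimal sketch of that workflow (the URLs are placeholders, use the real
repository locations):

  git clone git://example.org/ccollect.git
  cd ccollect
  git checkout -b my-feature
  # ... hack and commit ...
  git remote add public ssh://you@example.org/home/you/ccollect.git
  git push public my-feature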
|
||||||
|
|
||||||
|
|
||||||
|
Introduce a feature or change behaviour
|
||||||
|
---------------------------------------
|
||||||
|
Uhh, fancy! You have had a great idea, then it's time to change
|
||||||
|
the major version, so others know that something changed.
|
||||||
|
|
||||||
|
If the configuration format is changed, add a script to tools/
|
||||||
|
to allow users to upgrade their configuration to this major version.
|
||||||
|
|
||||||
|
And now comes the most difficult part: Add documentation. Nobody
|
||||||
|
benefits from your cool feature, if it is not known. I know, writing
|
||||||
|
documentation is not so much fun, but you also expect good documentation
|
||||||
|
for this software, don't you?
|
||||||
|
|
||||||
|
|
||||||
|
If you think my thinking quirks must be corrected
|
||||||
|
-------------------------------------------------
|
||||||
|
See above ("Submitting patches") and submit a patch to this file.
|
||||||
|
|
||||||
|
|
||||||
|
Thanks for reading.
|
||||||
|
|
@ -1,35 +0,0 @@
|
||||||
to Local to Remote
|
|
||||||
backup destination is exiting
|
|
||||||
pre/postexec runs locally
|
|
||||||
--link-dest?
|
|
||||||
/delete_incomplete - can chech ddir
|
|
||||||
|
|
||||||
can check destination dir
|
|
||||||
-> dooooooo it before!
|
|
||||||
|
|
||||||
|
|
||||||
remote_host!
|
|
||||||
=> rddir_ls:
|
|
||||||
incomplete: ls -1 "${INTERVAL}"*".${c_marker}"
|
|
||||||
|
|
||||||
host support?
|
|
||||||
ssh-host-support?
|
|
||||||
|
|
||||||
=> ssh_host => save to host
|
|
||||||
execute commands there!
|
|
||||||
|
|
||||||
rm!
|
|
||||||
|
|
||||||
--link-dest?
|
|
||||||
|
|
||||||
--link-dest=DIR
|
|
||||||
=> remote dirs, rsync remote
|
|
||||||
=> works!!!!
|
|
||||||
|
|
||||||
local_destination
|
|
||||||
remote_destination
|
|
||||||
=> remote_*
|
|
||||||
|
|
||||||
both
|
|
||||||
configuration is local (what to where)
|
|
||||||
|
|
||||||
|
|
@ -1 +0,0 @@
|
||||||
Do not read the files in this directory
|
|
||||||
196 doc/ccollect-restoring.text Normal file
@ -0,0 +1,196 @@
|
||||||
|
ccollect - Restoring backups
|
||||||
|
============================
|
||||||
|
Nico Schottelius <nico-ccollect__@__schottelius.org>
|
||||||
|
0.1, for all ccollect versions, Initial Version from 2008-07-04
|
||||||
|
:Author Initials: NS
|
||||||
|
|
||||||
|
|
||||||
|
Having backups is half the way to success on a failure.
|
||||||
|
Knowing how to restore the systems is the other half.
|
||||||
|
|
||||||
|
|
||||||
|
Introduction
|
||||||
|
------------
|
||||||
|
You made your backup and now you want to restore your
|
||||||
|
data. If you backed up only parts of a computer and need
|
||||||
|
only to restore them, it is pretty easy to achieve.
|
||||||
|
Restoring a whole system is a little bit more
|
||||||
|
difficult and needs some knowledge of the operating system.
|
||||||
|
|
||||||
|
|
||||||
|
Restoring parts of a system
|
||||||
|
~~~~~~~~~~~~~~~~~~~~~~~~~~~
|
||||||
|
Log into your backup server. Change into the
|
||||||
|
backup directory you want to restore from.
|
||||||
|
Do `rsync -av './files/to/be/recovered/' 'sourcehost:/files/to/be/recovered/'`.
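
For example, to pull back a single directory from a specific backup (all paths,
host and interval names below are made up; adjust them to your own layout):

--------------------------------------------------------------------------------
# on the backup server
cd /home/server/backup/server1/daily.20080704-0100.4242
rsync -av './etc/apache2/' 'server1:/etc/apache2/'
--------------------------------------------------------------------------------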
|
||||||
|
|
||||||
|
Restoring a complete system (general)
|
||||||
|
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
|
||||||
|
Boot the system to be rescued from a medium that contains low-level tools
|
||||||
|
for your OS (like partitioning, formatting) and the necessary tools
|
||||||
|
(ssh, tar or rsync).
|
||||||
|
Use the live system to:

- create the necessary volumes (like partitions, slices, ...)

Get a live CD that ships with:

- rsync / tar
- ssh/sshd (to connect from/to the backup server)
- support for your filesystems
|
||||||
|
|
||||||
|
Restoring a complete FreeBSD system
|
||||||
|
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
|
||||||
|
Get a FreeBSD-live-cd (I used the FreeBSD 7.0 live CD,
|
||||||
|
but FreeSBIE (http://www.freesbie.org/),
|
||||||
|
Frenzy (http://frenzy.org.ua/en/) or the
|
||||||
|
FreeBSD LiveCD (http://livecd.sourceforge.net/)
|
||||||
|
may also be helpful). The following way uses the FreeBSD 7.0
|
||||||
|
live cd.
|
||||||
|
|
||||||
|
So boot it up, select your language. After that select
|
||||||
|
*Custom* then *Partition*. Create the slice like you want
|
||||||
|
to have it. Then let the installer write into the MBR,
|
||||||
|
select *BootMgr*.
|
||||||
|
|
||||||
|
After that create the necessary labels, select *Label* and
|
||||||
|
make sure "Newfs" flag is set to "Y".
|
||||||
|
|
||||||
|
Finally, select *Commit* and choose an installation type
|
||||||
|
that must fail, because we want the installer only to write
|
||||||
|
the partitions and labels, but not to install anything on it.
|
||||||
|
|
||||||
|
At this point we have created the base for restoring the whole
|
||||||
|
system. Move back to the main menu and select *Fixit*, then
|
||||||
|
*CDROM/DVD*. This starts a shell on TTY4, which can be reached
|
||||||
|
by pressing *ALT+F4*. Then enter the following data:
|
||||||
|
|
||||||
|
--------------------------------------------------------------------------------
|
||||||
|
rootdir=/ccollect
|
||||||
|
rootdev=/dev/ad0s1a
|
||||||
|
backupserver=192.42.23.5
|
||||||
|
|
||||||
|
# create destination directory
|
||||||
|
mkdir "$rootdir"
|
||||||
|
|
||||||
|
# mount root; add other mounts if you created more labels
|
||||||
|
mount "$rootdev" "$rootdir"
|
||||||
|
|
||||||
|
# find out which network devices exist
|
||||||
|
ifconfig
|
||||||
|
|
||||||
|
# create the directory, because dhclient needs it
|
||||||
|
mkdir /var/db
|
||||||
|
|
||||||
|
# retrieve an ip address
|
||||||
|
dhclient fxp0
|
||||||
|
|
||||||
|
# test connection
|
||||||
|
ssh "$backupserver"
|
||||||
|
|
||||||
|
# go back
|
||||||
|
backupserver% exit
|
||||||
|
--------------------------------------------------------------------------------
|
||||||
|
|
||||||
|
Now we've prepared everything for the real backup. The next problem may be
that we cannot (and should not) be able to log in as root on the backup server.
Additionally, the system to be restored may not be reachable from the backup
server, because it is behind a firewall or NAT.
Thus I describe a way that is a little more complicated for those who
do not have these limitations, but works in both scenarios.
|
||||||
|
|
||||||
|
I just start netcat on the local machine, pipe its output to tar and put
|
||||||
|
both into the background. Then I create an ssh tunnel to the backup server,
|
||||||
|
which is then able to connect to my netcat "directly".
|
||||||
|
|
||||||
|
--------------------------------------------------------------------------------
|
||||||
|
# user to connect to the backupserver
|
||||||
|
myuser=nico
|
||||||
|
|
||||||
|
# our name in the backup
|
||||||
|
restorehost=server1
|
||||||
|
|
||||||
|
# the instance to be used
|
||||||
|
backup="weekly.20080718-2327.23053"
|
||||||
|
|
||||||
|
# Need to setup lo0 first, the livecd did not do it for me
|
||||||
|
ifconfig lo0 127.0.0.1 up
|
||||||
|
|
||||||
|
# change to the destination directory
|
||||||
|
cd "$rootdir"
|
||||||
|
|
||||||
|
# start listener
|
||||||
|
( nc -l 127.0.0.1 4242 | tar xvf - ) &
|
||||||
|
|
||||||
|
# verify that it runs correctly
|
||||||
|
sockstat -4l
|
||||||
|
|
||||||
|
# connect as a normal user to the backupserver
|
||||||
|
ssh -R4242:127.0.0.1:4242 "$myuser@$backupserver"
|
||||||
|
|
||||||
|
# become root
|
||||||
|
backupserver% su -
|
||||||
|
|
||||||
|
# change to the source directory
|
||||||
|
backupserver# cd /home/server/backup/$restorehost/$backup
|
||||||
|
|
||||||
|
# begin the backup
|
||||||
|
backup # tar cf - . | nc 127.0.0.1 4242
|
||||||
|
|
||||||
|
# wait until it finishes, press ctrl-c to kill netcat
|
||||||
|
# logoff the backupserver
|
||||||
|
backupserver# exit
|
||||||
|
backupserver% exit
|
||||||
|
--------------------------------------------------------------------------------
|
||||||
|
|
||||||
|
Now we are almost finished. Still, we have to take care of
|
||||||
|
some things:
|
||||||
|
|
||||||
|
- Do the block devices still have the same names? If not, correct /etc/fstab.
|
||||||
|
- Do the network devices still have the same names? If not, correct /etc/rc.conf.
|
||||||
|
|
||||||
|
If everything is fixed, let us finish the restore:
|
||||||
|
|
||||||
|
--------------------------------------------------------------------------------
|
||||||
|
# cleanly umount it
|
||||||
|
umount "$rootdir"
|
||||||
|
|
||||||
|
# reboot, remove the cd and bootup the restored system
|
||||||
|
reboot
|
||||||
|
--------------------------------------------------------------------------------
|
||||||
|
|
||||||
|
Restoring a complete Linux system
|
||||||
|
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
|
||||||
|
Knoppix
|
||||||
|
knoppix 2 at boot prompt
|
||||||
|
|
||||||
|
rootdir=/ccollect
|
||||||
|
dev=/dev/hda
|
||||||
|
rootdev="${dev}1"
|
||||||
|
fs=jfs
|
||||||
|
tar
|
||||||
|
|
||||||
|
# create the needed partitions
|
||||||
|
cfdisk $dev
|
||||||
|
|
||||||
|
mkfs.$fs $rootdev
|
||||||
|
|
||||||
|
mkdir $rootdir
|
||||||
|
|
||||||
|
mount $rootdev $rootdir
|
||||||
|
|
||||||
|
cd $rootdir
|
||||||
|
|
||||||
|
pump
|
||||||
|
ifconfig
|
||||||
|
|
||||||
|
# start listener (from now on it is the same as in the FreeBSD section above)
|
||||||
|
( nc -l 127.0.0.1 4242 | tar xvf - ) &
|
||||||
|
|
||||||
|
|
||||||
|
|
||||||
|
TO BE DONE
|
||||||
|
|
||||||
|
Future
|
||||||
|
------
|
||||||
|
I think about automating full system recoveries in the future.
|
||||||
|
I think it could be easily done and here are some hints for
|
||||||
|
people who would like to implement it.
|
||||||
|
|
@ -1,7 +1,7 @@
|
||||||
ccollect - Installing, Configuring and Using
|
ccollect - Installing, Configuring and Using
|
||||||
============================================
|
============================================
|
||||||
Nico Schottelius <nico-ccollect__@__schottelius.org>
|
Nico Schottelius <nico-ccollect__@__schottelius.org>
|
||||||
0.7, for ccollect 0.7.0, Initial Version from 2006-01-13
|
2.10, for ccollect 2.10, Initial Version from 2006-01-13
|
||||||
:Author Initials: NS
|
:Author Initials: NS
|
||||||
|
|
||||||
|
|
||||||
|
|
@ -21,9 +21,12 @@ Supported and tested operating systems and architectures
|
||||||
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
|
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
|
||||||
`ccollect` was successfully tested on the following platforms:
|
`ccollect` was successfully tested on the following platforms:
|
||||||
|
|
||||||
- GNU/Linux on amd64/hppa/i386/ppc
|
- FreeBSD on amd64/i386
|
||||||
- NetBSD on amd64/i386/sparc/sparc64
|
- GNU/Linux on amd64/arm/hppa/i386/ppc
|
||||||
|
- Mac OS X 10.5
|
||||||
|
- NetBSD on alpha/amd64/i386/sparc/sparc64
|
||||||
- OpenBSD on amd64
|
- OpenBSD on amd64
|
||||||
|
- Windows by installing Cygwin, OpenSSH and rsync
|
||||||
|
|
||||||
It *should* run on any Unix that supports `rsync` and has a POSIX-compatible
|
It *should* run on any Unix that supports `rsync` and has a POSIX-compatible
|
||||||
bourne shell. If your platform is not listed above and you have it successfully
|
bourne shell. If your platform is not listed above and you have it successfully
|
||||||
|
|
@ -34,7 +37,7 @@ Why you COULD only backup from remote hosts, not to them
|
||||||
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
|
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
|
||||||
While considering the design of ccollect, I thought about enabling
|
While considering the design of ccollect, I thought about enabling
|
||||||
backup to *remote* hosts. Though this sounds like a nice feature
|
backup to *remote* hosts. Though this sounds like a nice feature
|
||||||
('Backup my notebook to the server now.'), in my opinion it is a
|
('"Backup my notebook to the server now."'), in my opinion it is a
|
||||||
bad idea to backup to a remote host.
|
bad idea to backup to a remote host.
|
||||||
|
|
||||||
But as more and more people requested this feature, it was implemented,
|
But as more and more people requested this feature, it was implemented,
|
||||||
|
|
@ -66,12 +69,41 @@ machine, she will not be able to log in on the backup machine.
|
||||||
All other backups are still secure.
|
All other backups are still secure.
|
||||||
|
|
||||||
|
|
||||||
Incompatibilities
|
Incompatibilities and changes
|
||||||
~~~~~~~~~~~~~~~~~
|
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
|
||||||
|
|
||||||
|
|
||||||
|
Versions 0.9 and 1.0
|
||||||
|
^^^^^^^^^^^^^^^^^^^^
|
||||||
|
- Added "Error: " prefix in _exit_err()
|
||||||
|
|
||||||
|
Versions 0.8 and 0.9
|
||||||
|
^^^^^^^^^^^^^^^^^^^^
|
||||||
|
- Renamed script to ccollect (.sh is not needed)
|
||||||
|
- Removed feature to backup to a host via ccollect, added new tool
|
||||||
|
(FIXME: insert name here) that takes care of this via tunnel
|
||||||
|
- Perhaps creating subdirectory of source name (idea from Stefan Schlörholz)
|
||||||
|
|
||||||
|
Versions 0.7 and 0.8
|
||||||
|
^^^^^^^^^^^^^^^^^^^^
|
||||||
|
|
||||||
|
.The argument order changed:
|
||||||
|
- Old: "<interval name> [args] <sources to backup>"
|
||||||
|
- New: "[args] <interval name> <sources to backup>"
|
||||||
|
|
||||||
|
If you did not use arguments (most people do not), nothing will
|
||||||
|
change for you.
|
||||||
|
|
||||||
|
.Deletion of incomplete backups using the 'delete_incomplete' option
|
||||||
|
- Old: Only incomplete backups from the current interval have been removed
|
||||||
|
- New: All incomplete backups are deleted
|
||||||
|
|
||||||
|
.Support for standard values
|
||||||
|
- Old: no support
|
||||||
|
- New: Options in $CCOLLECT_CONF/defaults are used as defaults (see below)
|
||||||
|
|
||||||
Versions 0.6 and 0.7
|
Versions 0.6 and 0.7
|
||||||
^^^^^^^^^^^^^^^^^^^^^
|
^^^^^^^^^^^^^^^^^^^^
|
||||||
.The format of `destination` changed:
|
.The format of `destination` changed:
|
||||||
- Before 0.7 it was a (link to a) directory
|
- Before 0.7 it was a (link to a) directory
|
||||||
- As of 0.7 it is a textfile containing the destination
|
- As of 0.7 it is a textfile containing the destination
|
||||||
|
|
@ -83,7 +115,7 @@ You can update your configuration using `tools/config-pre-0.7-to-0.7.sh`.
|
||||||
|
|
||||||
|
|
||||||
Versions 0.5 and 0.6
|
Versions 0.5 and 0.6
|
||||||
^^^^^^^^^^^^^^^^^^^^^
|
^^^^^^^^^^^^^^^^^^^^
|
||||||
.The format of `rsync_options` changed:
|
.The format of `rsync_options` changed:
|
||||||
- Before 0.6 it was whitespace delimeted
|
- Before 0.6 it was whitespace delimeted
|
||||||
- As of 0.6 it is newline seperated (so you can pass whitespaces to `rsync`)
|
- As of 0.6 it is newline seperated (so you can pass whitespaces to `rsync`)
|
||||||
|
|
@ -100,7 +132,7 @@ XXXXX (- comes before digit).
|
||||||
|
|
||||||
|
|
||||||
Versions 0.4 and 0.5
|
Versions 0.4 and 0.5
|
||||||
^^^^^^^^^^^^^^^^^^^^^
|
^^^^^^^^^^^^^^^^^^^^
|
||||||
Not a real incompatibilty, but seems to fit in this section:
|
Not a real incompatibilty, but seems to fit in this section:
|
||||||
|
|
||||||
.0.5 does *NOT* require
|
.0.5 does *NOT* require
|
||||||
|
|
@ -137,15 +169,15 @@ Quick start
|
||||||
For those who do not want to read the whole long document:
|
For those who do not want to read the whole long document:
|
||||||
|
|
||||||
--------------------------------------------------------------------------------
|
--------------------------------------------------------------------------------
|
||||||
# get latest ccollect tarball from http://unix.schottelius.org/ccollect/
|
# get latest ccollect tarball from http://www.nico.schottelius.org/software/ccollect/
|
||||||
# replace value for CCV with the current version
|
# replace value for CCV with the current version
|
||||||
export CCV=0.7.0
|
export CCV=0.8.1
|
||||||
|
|
||||||
#
|
#
|
||||||
# replace 'wget' with 'fetch' on bsd
|
# replace 'wget' with 'fetch' on bsd
|
||||||
#
|
#
|
||||||
holen=wget
|
holen=wget
|
||||||
"$holen" http://unix.schottelius.org/ccollect/ccollect-${CCV}.tar.bz2
|
"$holen" http://www.nico.schottelius.org/software/ccollect/ccollect-${CCV}.tar.bz2
|
||||||
|
|
||||||
# extract the tarball, change to the newly created directory
|
# extract the tarball, change to the newly created directory
|
||||||
tar -xvjf ccollect-${CCV}.tar.bz2
|
tar -xvjf ccollect-${CCV}.tar.bz2
|
||||||
|
|
@ -210,7 +242,7 @@ Installing ccollect
|
||||||
~~~~~~~~~~~~~~~~~~~
|
~~~~~~~~~~~~~~~~~~~
|
||||||
For the installation you need at least
|
For the installation you need at least
|
||||||
|
|
||||||
- the latest ccollect package (http://unix.schottelius.org/ccollect/)
|
- the latest ccollect package (http://www.nico.schottelius.org/software/ccollect/)
|
||||||
- either `cp` and `chmod` or `install`
|
- either `cp` and `chmod` or `install`
|
||||||
- for more comfort: `make`
|
- for more comfort: `make`
|
||||||
- for rebuilding the generated documentation: additionally `asciidoc`
|
- for rebuilding the generated documentation: additionally `asciidoc`
|
||||||
|
|
@ -227,9 +259,9 @@ Using ccollect
|
||||||
Installing
|
Installing
|
||||||
----------
|
----------
|
||||||
Either type 'make install' or simply copy it to a directory in your
|
Either type 'make install' or simply copy it to a directory in your
|
||||||
$PATH and execute 'chmod *0755* /path/to/ccollect.sh'. If you would
|
$PATH and execute 'chmod *0755* /path/to/ccollect.sh'. If you like
|
||||||
like to use the new management scripts (available since 0.6), copy
|
to use the new management scripts (available since 0.6), copy the
|
||||||
the following scripts to a directory in $PATH:
|
following scripts to a directory in $PATH:
|
||||||
|
|
||||||
- `tools/ccollect_add_source.sh`
|
- `tools/ccollect_add_source.sh`
|
||||||
- `tools/ccollect_analyse_logs.sh.sh`
|
- `tools/ccollect_analyse_logs.sh.sh`
|
||||||
|
|
@ -244,8 +276,10 @@ After having installed and used ccollect, report success using
|
||||||
Configuring
|
Configuring
|
||||||
-----------
|
-----------
|
||||||
For configuration aid have a look at the above mentioned tools, which can assist
|
For configuration aid have a look at the above mentioned tools, which can assist
|
||||||
you quite well. When you are successfully using `ccollect`, report success using
|
you quite well. When you are successfully using `ccollect`, I would be happy if
|
||||||
`tools/report_success.sh`.
|
you add a link to your website, stating "I backup with ccollect", which points
|
||||||
|
to the ccollect homepage. So more people know about ccollect, use it and
|
||||||
|
improve it. You can also report success using `tools/report_success.sh`.
|
||||||
|
|
||||||
|
|
||||||
Runtime options
|
Runtime options
|
||||||
|
|
@ -253,7 +287,7 @@ Runtime options
|
||||||
`ccollect` looks for its configuration in '/etc/ccollect' or, if set, in
|
`ccollect` looks for its configuration in '/etc/ccollect' or, if set, in
|
||||||
the directory specified by the variable '$CCOLLECT_CONF':
|
the directory specified by the variable '$CCOLLECT_CONF':
|
||||||
--------------------------------------------------------------------------------
|
--------------------------------------------------------------------------------
|
||||||
# sh-compatible (zsh, mksh, ksh, bash, ...)
|
# sh-compatible (dash, zsh, mksh, ksh, bash, ...)
|
||||||
$ CCOLLECT_CONF=/your/config/dir ccollect.sh ...
|
$ CCOLLECT_CONF=/your/config/dir ccollect.sh ...
|
||||||
|
|
||||||
# csh
|
# csh
|
||||||
|
|
@ -277,7 +311,8 @@ The general configuration can be found in $CCOLLECT_CONF/defaults or
|
||||||
all source definitions, although the values can be overwritten in the source
|
all source definitions, although the values can be overwritten in the source
|
||||||
configuration.
|
configuration.
|
||||||
|
|
||||||
All configuration entries are plain-text files (use UTF-8 for non-ascii characters).
|
All configuration entries are plain-text files
|
||||||
|
(use UTF-8 for non-ascii characters).
|
||||||
|
|
||||||
|
|
||||||
Interval definition
|
Interval definition
|
||||||
|
|
@ -312,6 +347,15 @@ If you add '$CCOLLECT_CONF/defaults/`pre_exec`' or
|
||||||
will start `pre_exec` before the whole backup process and
|
will start `pre_exec` before the whole backup process and
|
||||||
`post_exec` after backup of all sources is done.
|
`post_exec` after backup of all sources is done.
|
||||||
|
|
||||||
|
If `pre_exec` exits with a non-zero return code, the whole backup
|
||||||
|
process will be aborted.
|
||||||
|
|
||||||
|
The `pre_exec` and `post_exec` script can access the following exported variables:
|
||||||
|
|
||||||
|
- 'INTERVAL': the interval selected (`daily`)
|
||||||
|
- 'no_sources': number of sources to backup (`2`)
|
||||||
|
- 'source_$no': name of the source, '$no' starts at 0 (`$source_0`)
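
For instance, a `pre_exec` script could simply log what is about to be saved;
a minimal sketch using only the variables listed above (this script is not
part of the ccollect distribution):

--------------------------------------------------------------------------------
#!/bin/sh
echo "interval: ${INTERVAL}, number of sources: ${no_sources}"
i=0
while [ "${i}" -lt "${no_sources}" ]; do
   eval echo "source_${i}: \$source_${i}"
   i=$((${i}+1))
done
--------------------------------------------------------------------------------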
|
||||||
|
|
||||||
The following example describes how to report free disk space in
|
The following example describes how to report free disk space in
|
||||||
human readable format before and after the whole backup process:
|
human readable format before and after the whole backup process:
|
||||||
-------------------------------------------------------------------------
|
-------------------------------------------------------------------------
|
||||||
|
|
@ -338,6 +382,9 @@ Each source contains at least the following files:
|
||||||
|
|
||||||
Additionally a source may have the following files:
|
Additionally a source may have the following files:
|
||||||
|
|
||||||
|
- `pre_exec` program to execute before backing up *this* source
|
||||||
|
- `post_exec` program to execute after backing up *this* source
|
||||||
|
|
||||||
- `verbose` whether to be verbose (passes -v to `rsync`)
|
- `verbose` whether to be verbose (passes -v to `rsync`)
|
||||||
- `very_verbose` be very verbose (`mkdir -v`, `rm -v` and `rsync -vv`)
|
- `very_verbose` be very verbose (`mkdir -v`, `rm -v` and `rsync -vv`)
|
||||||
- `summary` create a transfer summary when `rsync` finished
|
- `summary` create a transfer summary when `rsync` finished
|
||||||
|
|
@@ -345,18 +392,18 @@ Additionally a source may have the following files:
- `exclude` exclude list for `rsync`. newline-separated list.
- `rsync_options` extra options for `rsync`. newline-separated list.

- `delete_incomplete` delete incomplete backups
- `remote_host` host to backup to
- `rsync_failure_codes` list of rsync exit codes that indicate complete failure
- `mtime` Sort backup directories based on their modification time
- `quiet_if_down` Suppress error messages if source is not connectable


Example:
--------------------------------------------------------------------------------
[10:47] zaphodbeeblebrox:ccollect-0.2% ls -l conf/sources/testsource2
total 12
lrwxrwxrwx 1 nico users 20 2005-11-17 16:44 destination
-rw-r--r-- 1 nico users 62 2005-12-07 17:43 exclude
drwxr-xr-x 2 nico users 4096 2005-12-07 17:38 intervals
-rw-r--r-- 1 nico users 15 2005-11-17 16:44 source
@@ -373,6 +420,37 @@ Example:
/home/nico/vpn
--------------------------------------------------------------------------------

Default options
^^^^^^^^^^^^^^^
If you add '$CCOLLECT_CONF/defaults/`option_name`', the value will
be used in absence of the option in a source. If you want to prevent
the default value from being used in a source, you can create the file
'$CCOLLECT_CONF/sources/$name/`no_option_name`' (i.e. prefix it with
'no_').

Example:
--------------------------------------------------------------------------------
[9:04] ikn2:ccollect% touch conf/defaults/verbose
[9:04] ikn2:ccollect% touch conf/sources/local/no_verbose
--------------------------------------------------------------------------------
This enables the verbose option for all sources, but disables it for the
source 'local'.

If an option is specified in the defaults folder and in the source,
the source-specific version overrides the default one:

Example:
--------------------------------------------------------------------------------
[9:05] ikn2:ccollect% echo "backup-host" > conf/defaults/remote_host
[9:05] ikn2:ccollect% echo "different-host" > conf/sources/local/remote_host
--------------------------------------------------------------------------------

You can use all source options as defaults, with the exception of:

- `source`
- `destination`
- `pre_exec`
- `post_exec`

Detailed description of "source"
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
@@ -404,6 +482,12 @@ Detailed description of "remote_host"
If this file exists, you are backing up your data *TO* this host
and *not* to your local host.

*Warning*: You need to have `ssh` access to the remote host. `rsync` and
`ccollect` will connect to that host via `ssh`. `ccollect` needs shell
access, because it has to find out how many backups exist on the remote
host and must be able to delete them.


Example:
--------------------------------------------------------------------------------
[10:17] denkbrett:ccollect-0.7.0% cat conf/sources/remote1/remote_host
@@ -419,7 +503,7 @@ Detailed description of "verbose"

If this file exists in the source specification, *-v* will be passed to `rsync`.

Example:
--------------------------------------------------------------------------------
[11:35] zaphodbeeblebrox:ccollect-0.2% touch conf/sources/testsource1/verbose
@@ -536,6 +620,16 @@ respectively after doing the backup for *this specific* source.
If you want to have pre-/post-exec before and after *all*
backups, see above for general configuration.

If `pre_exec` exits with a non-zero return code, the backup of *this*
source will be aborted (i.e. the backup is skipped).

The `post_exec` script can access the following exported variables from
ccollect (see the sketch below for an illustration):

- 'name': name of the source that is being backed up
- 'destination_name': contains the base directory name (`daily.20091031-1013.24496`)
- 'destination_dir': full path (`/tmp/ccollect/daily.20091031-1013.24496`)
- 'destination_full': like 'destination_dir', but prepended with the remote_host, if set (`host:/tmp/ccollect/daily.20091031-1013.24496` or `/tmp/ccollect/daily.20091031-1013.24496`)

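For illustration, a source-specific `post_exec` sketch that records where the
backup landed (the log file path is an arbitrary assumption, not something
ccollect provides):

--------------------------------------------------------------------------------
#!/bin/sh
# Sketch only: log the finished backup using variables exported by ccollect.
# name, destination_name, destination_dir and destination_full are set by
# ccollect; /var/log/ccollect-post.log is just an example path.
echo "$(date): source ${name} -> ${destination_full}" >> /var/log/ccollect-post.log
--------------------------------------------------------------------------------
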
Example:
--------------------------------------------------------------------------------
@@ -560,6 +654,39 @@ was interrupted) and remove them. Without this file `ccollect` will only warn
the user.


Detailed description of "rsync_failure_codes"
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
If you have the file `rsync_failure_codes` in your source configuration
directory, it should contain a newline-separated list of numbers representing
rsync exit codes. If rsync exits with any code in this list, a marker will
be left in the destination directory indicating failure of this backup. If
you have enabled delete_incomplete, then this backup will be deleted during
the next ccollect run on the same interval.

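A minimal sketch of such a file (the source name 'testsource1' and the chosen
codes are only examples):

--------------------------------------------------------------------------------
# rsync exit code 10 = error in socket I/O, 12 = error in rsync protocol data stream
printf '10\n12\n' > conf/sources/testsource1/rsync_failure_codes
--------------------------------------------------------------------------------
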

Detailed description of "mtime"
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
By default, ccollect.sh chooses the most recent backup directory for cloning or
the oldest for deletion based on the directory's last change time (ctime).
With this option, the sorting is done based on modification time (mtime). With
this version of ccollect, the ctime and mtime of your backups will normally
be the same and this option has no effect. However, if you, for example, move
your backups to another hard disk using cp -a or rsync -a, you should use this
option because the ctimes are not preserved during such operations.

If you have any backups in your repository made with ccollect version 0.7.1 or
earlier, do not use this option.

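Like `verbose` above, this option is enabled simply by creating the file
(the source name is an example):

--------------------------------------------------------------------------------
touch conf/sources/testsource1/mtime
--------------------------------------------------------------------------------
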

Detailed description of "quiet_if_down"
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
By default, ccollect.sh emits a series of error messages if a source is not
connectable. With this option enabled, ccollect still reports that the
source is not connectable, but the associated error messages generated by
rsync or ssh are suppressed. You may want to use this option for sources,
like notebook PCs, that are often disconnected.

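As with the other flag options, enabling it only requires creating the file
(the source name 'notebook' is hypothetical):

--------------------------------------------------------------------------------
touch conf/sources/notebook/quiet_if_down
--------------------------------------------------------------------------------
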

Hints
-----

@@ -587,7 +714,7 @@ host mx2.schottelius.org
Port 2342
--------------------------------------------------------------------------------

If you use that port for backup only and normally want to use another port,
you can add 'HostName' and 'HostKeyAlias' (if you also have different
keys on the different ports):

@@ -604,17 +731,20 @@ Host bruehe
--------------------------------------------------------------------------------


Using source names or interval in pre_/post_exec scripts
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
The pre-/post_exec scripts can access some internal variables from `ccollect`:

- 'INTERVAL': The interval specified on the command line
- 'no_sources': number of sources
- 'source_$NUM': the name of the source
- 'name': the name of the source currently being backed up (not available for
  the generic pre_exec script)

Only available for `post_exec`:

- 'remote_host': name of host we backup to (empty if unused)


Using rsync protocol without ssh
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
@@ -726,6 +856,44 @@ of different requirements, you can even omit creating
`/etc/ccollect/default/intervals/daily`.


Comparing backups
~~~~~~~~~~~~~~~~~
If you want to see what changed between two backups, you can use
`rsync` directly:

--------------------------------------------------------------------------------
[12:00] u0255:ddba034.netstream.ch# rsync -n -a --delete --stats --progress daily.20080324-0313.17841/ daily.20080325-0313.31148/
--------------------------------------------------------------------------------
This results in a listing of changes. Because we pass -n to rsync, no transfer
is made (i.e. report-only mode).

This hint was reported by Daniel Aubry.


Testing for host reachability
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
If you want to test whether the host you are trying to back up is reachable,
you can use the following script as a source-specific pre_exec:

--------------------------------------------------------------------------------
#!/bin/sh
ping -c1 -q `cat "/etc/ccollect/sources/$name/source" | cut -d"@" -f2 | cut -d":" -f1`
--------------------------------------------------------------------------------

This prevents the deletion of old backups if the host is not reachable.

This hint was reported by Daniel Aubry.


Easy check for errors
~~~~~~~~~~~~~~~~~~~~~
If you want to see whether there have been any errors while doing the backup,
you can run `ccollect` together with `ccollect_analyse_logs.sh`:
--------------------------------------------------------------------------------
$ ccollect | ccollect_analyse_logs.sh e
--------------------------------------------------------------------------------


F.A.Q.
------

@@ -815,6 +983,31 @@ you can enter your password (have a look at screen(1), especially "C-a M"
and "C-a _", for more information).


Backup fails if autofs is running but sources are not reachable
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
If you are trying to back up a system containing paths that are managed
by autofs, you may run into this error:

-------------------------------------------------------------------------------
2009-12-01-23:14:15: ccollect 0.8.1: Beginning backup using interval monatlich
[ikn] 2009-12-01-23:14:15: Beginning to backup
[ikn] 2009-12-01-23:14:15: Executing /home/users/nico/ethz/ccollect/sources/ikn/pre_exec ...
Enter LUKS passphrase:
[ikn] Command successful.
[ikn] key slot 0 unlocked.
[ikn] 2009-12-01-23:14:23: Finished /home/users/nico/ethz/ccollect/sources/ikn/pre_exec (return code 0).
[ikn] directory has vanished: "/home/users/nico/privat/firmen/ethz/autofs/projects"
[ikn] directory has vanished: "/home/users/nico/privat/firmen/ethz/autofs/scratch"
[ikn] directory has vanished: "/home/users/nico/privat/firmen/ethz/autofs/sgscratch"
[ikn] directory has vanished: "/home/users/nico/privat/firmen/ethz/autofs/supp"
[ikn] directory has vanished: "/home/users/nico/privat/firmen/ethz/autofs/sysadmin"
[ikn] rsync warning: some files vanished before they could be transferred (code 24) at main.c(1057) [sender=3.0.6]
[ikn] 2009-12-01-23:44:23: Source / is not readable. Skipping.
-------------------------------------------------------------------------------

Thus, if you are unsure whether autofs paths can be mounted during backup,
stop autofs in pre_exec and re-enable it in post_exec.

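A sketch of such a source-specific `pre_exec`, assuming an init-script based
system (the `/etc/init.d/autofs` path is an assumption; adjust it to your
distribution's service manager):

--------------------------------------------------------------------------------
#!/bin/sh
# Sketch only: stop autofs before backing up this source.
/etc/init.d/autofs stop
--------------------------------------------------------------------------------

The matching `post_exec` would run `/etc/init.d/autofs start` again after the
backup.
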

Examples
--------

@@ -853,12 +1046,12 @@ srwali01:~# cd /etc/ccollect/sources
srwali01:/etc/ccollect/sources# mkdir windos-wl6
srwali01:/etc/ccollect/sources# cd windos-wl6/
srwali01:/etc/ccollect/sources/windos-wl6# echo /mnt/win/SYS/WL6 > source
srwali01:/etc/ccollect/sources/windos-wl6# echo /mnt/hdbackup/wl6 > destination
srwali01:/etc/ccollect/sources/windos-wl6# mkdir /mnt/hdbackup/wl6
srwali01:/etc/ccollect/sources/windos-wl6# cd ..
srwali01:/etc/ccollect/sources# mkdir windos-daten
srwali01:/etc/ccollect/sources/windos-daten# echo /mnt/win/Daten > source
srwali01:/etc/ccollect/sources/windos-daten# echo /mnt/hdbackup/windos-daten > destination
srwali01:/etc/ccollect/sources/windos-daten# mkdir /mnt/hdbackup/windos-daten

# Now add some remote source
@@ -997,12 +1190,12 @@ rsync -av -H --delete /mnt/archiv/ "$DDIR/archiv/"
-------------------------------------------------------------------------


Processes running when doing ccollect -j
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
Truncated output from `ps axuwwwf`:

-------------------------------------------------------------------------
S+ 11:40 0:00 | | | \_ /bin/sh /usr/local/bin/ccollect.sh daily -j ddba034 ddba045 ddba046 ddba047 ddba049 ddna010 ddna011
S+ 11:40 0:00 | | | \_ /bin/sh /usr/local/bin/ccollect.sh daily ddba034
S+ 11:40 0:00 | | | | \_ /bin/sh /usr/local/bin/ccollect.sh daily ddba034
R+ 11:40 23:40 | | | | | \_ rsync -a --delete --numeric-ids --relative --delete-excluded --link-dest=/home/server/backup/ddba034
doc/changes/0.7.1 (new file)
@@ -0,0 +1,9 @@
* Added support for global delete_incomplete option
* Updated tools/ccollect_analyse_logs.sh: Added more error strings to find
* Removed use of 'basename': Replaced it with standard variables from cconf
* Updated documentation
* More hints
* Updated remote_host description
* Bugfix in shell arithmetic (Jeroen Bruijning)
* Bugfix: Allow "&" in sourcename (Reported by Tiziano Müller)
* Added ccollect_list_intervals.sh to list intervals with values
doc/changes/0.8 (new file)
@@ -0,0 +1,14 @@
* Introduce consistent time sorting (John Lawless)
* Check for source connectivity before trying backup (John Lawless)
* Defensive programming patch (John Lawless)
* Some code cleanups (argument parsing, usage) (Nico Schottelius)
* Only consider directories as sources when using -a (Nico Schottelius)
* Fix general parsing problem with -a (Nico Schottelius)
* Fix potential bug when using remote_host, delete_incomplete and ssh (Nico Schottelius)
* Improve removal performance: minimised number of 'rm' calls (Nico Schottelius)
* Support sorting by mtime (John Lawless)
* Improve option handling (John Lawless)
* Add support for quiet operation for dead devices (quiet_if_down) (John Lawless)
* Add smart option parsing, including support for default values (John Lawless)
* Updated and cleaned up documentation (Nico Schottelius)
* Fixed bug "removal of current directory" in ccollect_delete_source.sh (Found by Günter Stöhr, fixed by Nico Schottelius)
doc/changes/2.0 (new file)
@@ -0,0 +1,16 @@
* Introduce -j option for max parallel jobs, deprecate -p (Darko Poljak)
* Add locking (Darko Poljak)
* Fix source-is-up check (Nikita Koshikov)
* Fix some minor command line parsing issues (Nico Schottelius)
* Correct output if configuration is not in cconfig format (Nico Schottelius)
* Minor code cleanups and optimisations (Nico Schottelius)
* ccollect_analyse_logs.sh traps more errors and warnings (Patrick Drolet)
* Remove -v for mkdir and rm, as they are not POSIX (Patrick Drolet)
* Export destination_* to source-specific post_exec (Nico Schottelius)
* Update documentation regarding exported variables (Nico Schottelius)
* Simplify time calculation (Nico Schottelius)
* Document pre_exec error handling (Nico Schottelius)
* Added start script (Thorsten Elle)
* Document autofs hint (Nico Schottelius)
* Speedup source-is-up check and remove --archive (Nico Schottelius)
* Removed support for remote backup (see doc) (Nico Schottelius)
doc/changes/2.1 (new file)
@@ -0,0 +1 @@
* Add options for stdout, file and syslog logging (Darko Poljak)
doc/changes/2.10 (new file)
@@ -0,0 +1 @@
* Add 'current' symlink to backup destinations (Steffen Zieger)
doc/changes/2.2 (new file)
@@ -0,0 +1 @@
* Bugfix: an empty rsync_options line caused the source to be destroyed (Darko Poljak)
Some files were not shown because too many files have changed in this diff.