Discussion: [Duplicity-talk] why restart takes so long
c***@ccs.covici.com
2016-05-10 12:02:01 UTC
Permalink
Hi.  I had to restart my full backup and it's going through every file it
had, and it is taking 18 hours and it's not done yet.  I see a loop in
there and I wonder if there is any way to shorten the time or get rid of
the loop altogether?

Thanks in advance for any suggestions.
--
Your life is like a penny.  You're going to lose it.  The question is:
How do you spend it?

John Covici
***@ccs.covici.com
Kenneth Loafman
2016-05-10 14:32:50 UTC
Permalink
No, there is no way to get rid of the loop; whichever one you're talking
about, there are a lot of them.

The main suggestion I have for anyone with multi-hour backups is to:

1. Use a local filesystem or USB drive for backup,
2. Transfer the backup using rsync or a similar tool.

The second suggestion I have is to chunk the backup into reasonably sized
portions, 200-500GB each.
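
For example, a rough sketch of the first suggestion (the source path, USB
mount point, key and remote host below are just placeholders):

    # 1. back up to a local filesystem or USB drive first
    duplicity --encrypt-key MYKEY --volsize 300 /home file:///mnt/usb/duplicity

    # 2. then push the finished volumes to the remote machine with rsync
    rsync -av --partial /mnt/usb/duplicity/ backupuser@remote.example.com:/backups/duplicity/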

...Ken
c***@ccs.covici.com
2016-05-10 15:48:23 UTC
Permalink
But the question is why does duplicity need to do that at all?
Couldn't it go through just the last hundred or so, or some reasonable
number?  Also, what about the original error which caused the restart in
the first place?  What is up with that?
Kenneth Loafman
2016-05-10 16:18:51 UTC
Permalink
It sounds like the backup restarted completely, rather than at the last
known spot. What do the first 50 or so lines of the log contain?

...Ken
c***@ccs.covici.com
2016-05-10 17:02:58 UTC
Permalink
Well, what happens is that it says "A" and all the filenames, but no
reference to a tar file, so it's not really backing up again; this is why
I was wondering what it is doing.  The part of the program which seems
to be involved is this:
def restart_position_iterator(tarblock_iter):
    """
    Fake writing to backend, but do go through all the source paths.
    Stop when we have processed the last file and block from the
    last backup.  Normal backup will proceed at the start of the
    next volume in the set.

    @type tarblock_iter: tarblock_iter
    @param tarblock_iter: iterator for current tar block

    @rtype: int
    @return: constant 0 (zero)
    """
    last_index = globals.restart.last_index
    last_block = globals.restart.last_block
    try:
        # Just spin our wheels
        iter_result = tarblock_iter.next()
        while iter_result:
            if (tarblock_iter.previous_index == last_index):
                # If both the previous index and this index are done, exit now
                # before we hit the next index, to prevent skipping its first
                # block.
                if not last_block and not tarblock_iter.previous_block:
                    break
                # Only check block number if last_block is also a number
                if last_block and tarblock_iter.previous_block > last_block:
                    break
            if tarblock_iter.previous_index > last_index:
                log.Warn(_("File %s complete in backup set.\n"
                           "Continuing restart on file %s.") %
Kenneth Loafman
2016-05-10 18:16:24 UTC
Permalink
Do you have a lot of very small files? This should not take too long.
c***@ccs.covici.com
2016-05-10 19:19:45 UTC
Permalink
I have thousands of files, some large and some small; it's about a
terabyte's worth.  It stopped pretty near the end and it just stopped
again, so my emphasis would be to fix the error and maybe do something
about optimizing the program.
Kenneth Loafman
2016-05-11 14:33:21 UTC
Permalink
One quick question about the format of the traceback you show... is that
because of the email client you use, or does it show up that way on the
console?

I'd like to see the block of data at the start of the log, say the first
200 lines.  Please dump it in raw text format.
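
For example, something along these lines (the log path is just a placeholder):

    # assuming the run was captured to a log, e.g. with
    #   duplicity ... -v info ... 2>&1 | tee /tmp/duplicity.log
    head -n 200 /tmp/duplicity.log > duplicity-first-200.txt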
c***@ccs.covici.com
2016-05-11 16:25:11 UTC
Permalink
Well, here are the first few lines from the log when I did the restart.
Since I can't see my own posts, I can't answer the question about the
traceback.

Using archive dir: /root/.cache/duplicity/backup_linux
Using backup name: backup_linux
Import of duplicity.backends.acdclibackend Succeeded
Import of duplicity.backends.azurebackend Succeeded
Import of duplicity.backends.b2backend Succeeded
Import of duplicity.backends.botobackend Succeeded
Import of duplicity.backends.cfbackend Succeeded
Import of duplicity.backends.copycombackend Succeeded
Import of duplicity.backends.dpbxbackend Failed: No module named dropbox
Import of duplicity.backends.gdocsbackend Succeeded
Import of duplicity.backends.giobackend Succeeded
Import of duplicity.backends.hsibackend Succeeded
Import of duplicity.backends.hubicbackend Succeeded
Import of duplicity.backends.imapbackend Succeeded
Import of duplicity.backends.lftpbackend Succeeded
Import of duplicity.backends.localbackend Succeeded
Import of duplicity.backends.mediafirebackend Succeeded
Import of duplicity.backends.megabackend Succeeded
Import of duplicity.backends.multibackend Succeeded
Import of duplicity.backends.ncftpbackend Succeeded
Import of duplicity.backends.onedrivebackend Succeeded
Import of duplicity.backends.par2backend Succeeded
Import of duplicity.backends.pydrivebackend Succeeded
Import of duplicity.backends.rsyncbackend Succeeded
Import of duplicity.backends.ssh_paramiko_backend Succeeded
Import of duplicity.backends.ssh_pexpect_backend Succeeded
Import of duplicity.backends.swiftbackend Succeeded
Import of duplicity.backends.sxbackend Succeeded
Import of duplicity.backends.tahoebackend Succeeded
Import of duplicity.backends.webdavbackend Succeeded
Reading globbing filelist /etc/azure_excludes.txt
Main action: inc
================================================================================
duplicity 0.7.07 (April 10, 2016)
Args: /usr/bin/duplicity -v info --volsize 300 --gpg-options --pinentry-mode loopback --exclude-filelist /etc/azure_excludes.txt --name backup_linux --encrypt-key ***@ccs.covici.com / azure://linux
Linux ccs.covici.com 4.1.17-gentoo #1 SMP PREEMPT Mon Feb 15 15:05:32 EST 2016 x86_64 Intel(R) Core(TM) i7-2600 CPU @ 3.40GHz
/usr/lib/python-exec/python2.7/python2 2.7.11 (default, Jan 29 2016, 22:32:51)
[GCC 4.9.3]
================================================================================
Using temporary directory /tmp/duplicity-K67URk-tempdir
Temp has 37554356224 available, backup will use approx 408944640.
Local and Remote metadata are synchronized, no sync needed.
Last full backup left a partial set, restarting.
Last full backup date: Fri May 6 16:24:16 2016
RESTART: Volumes 45955 to 45955 failed to upload before termination.
Restarting backup at volume 45955.
Deleting /tmp/duplicity-K67URk-tempdir/mktemp-gLMnDl-2
Restarting after volume 45954, file var/www/covici.com/htdocs-secure/owncloud/data/mattguice/files/20151114Town Hall Meeting (Choral Prelude).mp3, block 977
A .
A audio
A audio/.mp3crc
A audio/0015.ps

And so on and so forth.
Scott McKenzie
2016-05-11 21:12:09 UTC
Permalink
Hi covici

I am not sure why the restart is taking so long, but I found a bug in the
azurebackend that was causing it to fail occasionally.  Instead of retrying
a failed upload, it would cause duplicity to fail the backup.  You can find
the fix here:
https://code.launchpad.net/~noizyland/duplicity/fix_azurebackend_typo

Please test it and let me know if it helps.
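
If you have bzr installed, one way to grab the branch and diff it against
your installed copy would be something like this (the installed path is just
an example):

    # fetch the branch from Launchpad
    bzr branch lp:~noizyland/duplicity/fix_azurebackend_typo

    # compare the fixed Azure backend against the installed one
    diff -u /usr/lib/python2.7/site-packages/duplicity/backends/azurebackend.py \
        fix_azurebackend_typo/duplicity/backends/azurebackend.py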


-Scott
c***@ccs.covici.com
2016-05-11 21:32:18 UTC
Permalink
Thanks, I will definitely try this.
c***@ccs.covici.com
2016-05-11 21:40:26 UTC
Permalink
How do I get this branch?  I have installed this on Gentoo using a tar
archive in the download directory.  Can you give me the patch, so I can
fix it by hand?

Also, I wonder why an upload should fail at all, since there is no space
limit?

Thanks much.
Post by Scott McKenzie
Hi covici
I am not sure why the restart is taking so long, but I found a bug in the
azurebackend that was causing it to tail occasionally. Instead of retrying
a failed upload it would cause duplicity to fail the backup. You can find
https://code.launchpad.net/~noizyland/duplicity/fix_azurebackend_typo
Please test it and let me know if it helps.
-Scott
Post by c***@ccs.covici.com
Well, here are the first few lines from the log when I did the restart.
Since I can't see my own posts, I can't answer the question about the
traceback.
Using archive dir: /root/.cache/duplicity/backup_linux
Using backup name: backup_linux
Import of duplicity.backends.acdclibackend Succeeded
Import of duplicity.backends.azurebackend Succeeded
Import of duplicity.backends.b2backend Succeeded
Import of duplicity.backends.botobackend Succeeded
Import of duplicity.backends.cfbackend Succeeded
Import of duplicity.backends.copycombackend Succeeded
Import of duplicity.backends.dpbxbackend Failed: No module named dropbox
Import of duplicity.backends.gdocsbackend Succeeded
Import of duplicity.backends.giobackend Succeeded
Import of duplicity.backends.hsibackend Succeeded
Import of duplicity.backends.hubicbackend Succeeded
Import of duplicity.backends.imapbackend Succeeded
Import of duplicity.backends.lftpbackend Succeeded
Import of duplicity.backends.localbackend Succeeded
Import of duplicity.backends.mediafirebackend Succeeded
Import of duplicity.backends.megabackend Succeeded
Import of duplicity.backends.multibackend Succeeded
Import of duplicity.backends.ncftpbackend Succeeded
Import of duplicity.backends.onedrivebackend Succeeded
Import of duplicity.backends.par2backend Succeeded
Import of duplicity.backends.pydrivebackend Succeeded
Import of duplicity.backends.rsyncbackend Succeeded
Import of duplicity.backends.ssh_paramiko_backend Succeeded
Import of duplicity.backends.ssh_pexpect_backend Succeeded
Import of duplicity.backends.swiftbackend Succeeded
Import of duplicity.backends.sxbackend Succeeded
Import of duplicity.backends.tahoebackend Succeeded
Import of duplicity.backends.webdavbackend Succeeded
Reading globbing filelist /etc/azure_excludes.txt
Main action: inc
================================================================================
duplicity 0.7.07 (April 10, 2016)
Args: /usr/bin/duplicity -v info --volsize 300 --gpg-options
--pinentry-mode loopback --exclude-filelist /etc/azure_excludes.txt azure://linux
Linux ccs.covici.com 4.1.17-gentoo #1 SMP PREEMPT Mon Feb 15 15:05:32
/usr/lib/python-exec/python2.7/python2 2.7.11 (default, Jan 29 2016,
22:32:51)
[GCC 4.9.3]
================================================================================
Using temporary directory /tmp/duplicity-K67URk-tempdir
Temp has 37554356224 available, backup will use approx 408944640.
Local and Remote metadata are synchronized, no sync needed.
Last full backup left a partial set, restarting.
Last full backup date: Fri May 6 16:24:16 2016
RESTART: Volumes 45955 to 45955 failed to upload before termination.
Restarting backup at volume 45955.
Deleting /tmp/duplicity-K67URk-tempdir/mktemp-gLMnDl-2
Restarting after volume 45954, file var/www/covici.com/htdocs-secure/owncloud/data/mattguice/files/20151114Town Hall Meeting (Choral Prelude).mp3, block 977
A .
A audio
A audio/.mp3crc
A audio/0015.ps
And so on and so forth.
Post by Kenneth Loafman
One quick question about the format of the traceback you show... is that
because of the email client you use, or does it show up that way on the
console?
I'd like to see the block of data at the first of the log, say the first
200 lines. Please dump in raw text format.
Post by c***@ccs.covici.com
I have thousands of files, some large and some small; it's about a
terabyte's worth. It stopped pretty near the end and it just stopped
again, so my emphasis would be to fix the error and maybe do something
about optimizing the program.
Post by Kenneth Loafman
Do you have a lot of very small files? This should not take too
long.
Post by c***@ccs.covici.com
Well, what happens is that it says A and all the filenames, but no
reference to a tar file, so it's not really backing up again, this is
why I was wondering what it is doing. The part of the program which
seems to be doing this is:

    """
    Fake writing to backend, but do go through all the source paths.

    Stop when we have processed the last file and block from the
    last backup.  Normal backup will proceed at the start of the
    next volume in the set.

    @type tarblock_iter: tarblock_iter
    @param tarblock_iter: iterator for current tar block

    @rtype: int
    @return: constant 0 (zero)
    """
    last_index = globals.restart.last_index
    last_block = globals.restart.last_block
    # Just spin our wheels
    iter_result = tarblock_iter.next()
    while iter_result:
        if (tarblock_iter.previous_index == last_index):
            # If both the previous index and this index are done, exit now
            # before we hit the next index, to prevent skipping its first
            # block.
            if not last_block and not tarblock_iter.previous_block:
                break
            # Only check block number if last_block is also a number
            if last_block and tarblock_iter.previous_block > last_block:
                break
        if tarblock_iter.previous_index > last_index:
            log.Warn(_("File %s complete in backup set.\n"
                       "Continuing restart on file %s.") % ...
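
In other words, on a restart duplicity does not seek straight to volume
45955: it walks the whole source tree again and re-reads every file so the
tarblock iterator can be advanced to the exact (file, block) position
recorded for the last completed volume. Nothing is uploaded during that
pass, which is why the log shows only the "A <filename>" lines, and why the
time scales with the size of everything already backed up rather than with
what is left. A minimal standalone sketch of the idea (the names and demo
data are made up; this is not duplicity's code):

def advance_to_restart_point(blocks, last_path, last_block):
    """Consume (path, block) pairs until the saved restart position is reached.

    'blocks' stands in for duplicity's tarblock_iter: every file still has to
    be read and split into blocks just to regenerate the stream, even though
    nothing is written to the backend while we scan.
    """
    for path, block in blocks:
        if path == last_path and block >= last_block:
            return path, block          # real uploading resumes after this
    raise RuntimeError("restart position not found in source")

# Tiny demo with a fake block stream; duplicity reads the actual files here.
stream = ((p, b) for p in ("etc/fstab", "var/www/site.mp3") for b in range(3))
print(advance_to_restart_point(stream, "var/www/site.mp3", 1))
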
--
Your life is like a penny. You're going to lose it. The question is:
How do
you spend it?

John Covici
***@ccs.covici.com
Scott McKenzie
2016-05-12 03:39:44 UTC
Permalink
You can view the patch here:
http://bazaar.launchpad.net/~noizyland/duplicity/fix_azurebackend_typo/revision/1213#duplicity/backends/azurebackend.py
Post by c***@ccs.covici.com
How do I get this branch? I have installed this on Gentoo using a tar
archive from the download directory. Can you give me the patch, so I can
apply the fix by hand?
Also, I wonder why an upload should fail at all since there is no space
limit?
Thanks much.
_______________________________________________
Duplicity-talk mailing list
https://lists.nongnu.org/mailman/listinfo/duplicity-talk
c***@ccs.covici.com
2016-05-12 04:14:34 UTC
Permalink
OK, thanks.
Post by Scott McKenzie
http://bazaar.launchpad.net/~noizyland/duplicity/fix_azurebackend_typo/revision/1213#duplicity/backends/azurebackend.py
--
Your life is like a penny. You're going to lose it. The question is:
How do
you spend it?

John Covici
***@ccs.covici.com
Kenneth Loafman
2016-05-12 11:39:03 UTC
Permalink
@Scott, would you please mark your branch as 'propose to merge'.

...Thanks,
...Ken
Post by c***@ccs.covici.com
OK, thanks.
http://bazaar.launchpad.net/~noizyland/duplicity/fix_azurebackend_typo/revision/1213#duplicity/backends/azurebackend.py
_______________________________________________
Duplicity-talk mailing list
https://lists.nongnu.org/mailman/listinfo/duplicity-talk
c***@ccs.covici.com
2016-05-12 18:02:53 UTC
Permalink
Well, I think it finished all the volumes, but it died during the writing of
signatures -- so I restarted, and for more than 3 hours it has been doing
something. The last lines of its log are:
Synchronizing remote metadata to local cache...
Deleting local
/root/.cache/duplicity/backup_linux/duplicity-full-signatures.20160506T202416Z.sigtar.gpg
(not authoritative at backend).

I hope it's doing the right thing. That directory contains the following:

-rw-r--r-- 2 root root 0 May 12 10:39 ccs.covici.com.108584777488522569313094
-rw------- 1 root root 228M May 12 10:27 duplicity-full.20160506T202416Z.manifest.part
-rw------- 1 root root 13G May 12 09:38 duplicity-full-signatures.20160506T202416Z.sigtar.part
-rw-r--r-- 2 root root 0 May 12 10:39 lockfile.lock

but it has not changed since 10:38.
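The "(not authoritative at backend)" message suggests the sync step treats the backend's file listing as the source of truth and prunes local cache entries the backend does not list. A rough sketch of that kind of reconciliation, with made-up names -- not duplicity's actual code:

import os

def prune_local_cache(cache_dir, remote_names):
    """Illustrative only: remove cached metadata files that the backend
    listing (remote_names, a set of file names) no longer contains."""
    removed = []
    for name in os.listdir(cache_dir):
        if name.endswith(".part"):      # in-progress local files, leave alone
            continue
        if name.startswith("duplicity-") and name not in remote_names:
            os.unlink(os.path.join(cache_dir, name))
            removed.append(name)
    return removed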
Post by Scott McKenzie
http://bazaar.launchpad.net/~noizyland/duplicity/fix_azurebackend_typo/revision/1213#duplicity/backends/azurebackend.py
Post by c***@ccs.covici.com
How do I get this branch? I have installed this on gentoo using a tar
archive in the download directory. Can you give me the patch, so I can
fix it by hand?
Also, I wonder why an upload should fail at all, since there is no space
limit?
Thanks much.
Post by Scott McKenzie
Hi covici
I am not sure why the restart is taking so long, but I found a bug in the
azurebackend that was causing it to fail occasionally. Instead of retrying
a failed upload it would cause duplicity to fail the backup. You can find it at:
https://code.launchpad.net/~noizyland/duplicity/fix_azurebackend_typo
Please test it and let me know if it helps.
-Scott
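The failure mode described here -- aborting on the first failed upload instead of retrying -- is what a small retry wrapper is meant to prevent. A minimal sketch of the idea with hypothetical names; this is not the code in the linked branch:

import time

def upload_with_retries(put_fn, source_path, remote_name, retries=5, delay=10):
    """Retry a flaky upload a few times before letting the backup fail.
    Sketch only -- put_fn stands in for whatever the backend's upload call is."""
    for attempt in range(1, retries + 1):
        try:
            return put_fn(source_path, remote_name)
        except Exception:
            if attempt == retries:
                raise
            time.sleep(delay)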
Post by c***@ccs.covici.com
Well, here are the first few lines from the log when I did the restart.
Since I can't see my own posts, I can't answer the question about the
traceback.
Using archive dir: /root/.cache/duplicity/backup_linux
Using backup name: backup_linux
Import of duplicity.backends.acdclibackend Succeeded
Import of duplicity.backends.azurebackend Succeeded
Import of duplicity.backends.b2backend Succeeded
Import of duplicity.backends.botobackend Succeeded
Import of duplicity.backends.cfbackend Succeeded
Import of duplicity.backends.copycombackend Succeeded
Import of duplicity.backends.dpbxbackend Failed: No module named dropbox
Import of duplicity.backends.gdocsbackend Succeeded
Import of duplicity.backends.giobackend Succeeded
Import of duplicity.backends.hsibackend Succeeded
Import of duplicity.backends.hubicbackend Succeeded
Import of duplicity.backends.imapbackend Succeeded
Import of duplicity.backends.lftpbackend Succeeded
Import of duplicity.backends.localbackend Succeeded
Import of duplicity.backends.mediafirebackend Succeeded
Import of duplicity.backends.megabackend Succeeded
Import of duplicity.backends.multibackend Succeeded
Import of duplicity.backends.ncftpbackend Succeeded
Import of duplicity.backends.onedrivebackend Succeeded
Import of duplicity.backends.par2backend Succeeded
Import of duplicity.backends.pydrivebackend Succeeded
Import of duplicity.backends.rsyncbackend Succeeded
Import of duplicity.backends.ssh_paramiko_backend Succeeded
Import of duplicity.backends.ssh_pexpect_backend Succeeded
Import of duplicity.backends.swiftbackend Succeeded
Import of duplicity.backends.sxbackend Succeeded
Import of duplicity.backends.tahoebackend Succeeded
Import of duplicity.backends.webdavbackend Succeeded
Reading globbing filelist /etc/azure_excludes.txt
Main action: inc
================================================================================
duplicity 0.7.07 (April 10, 2016)
Args: /usr/bin/duplicity -v info --volsize 300 --gpg-options
--pinentry-mode loopback --exclude-filelist /etc/azure_excludes.txt
Linux ccs.covici.com 4.1.17-gentoo #1 SMP PREEMPT Mon Feb 15 15:05:32
/usr/lib/python-exec/python2.7/python2 2.7.11 (default, Jan 29 2016, 22:32:51)
[GCC 4.9.3]
================================================================================
Using temporary directory /tmp/duplicity-K67URk-tempdir
Temp has 37554356224 available, backup will use approx 408944640.
Local and Remote metadata are synchronized, no sync needed.
Last full backup left a partial set, restarting.
Last full backup date: Fri May 6 16:24:16 2016
RESTART: Volumes 45955 to 45955 failed to upload before termination.
         Restarting backup at volume 45955.
Deleting /tmp/duplicity-K67URk-tempdir/mktemp-gLMnDl-2
Restarting after volume 45954, file
var/www/covici.com/htdocs-secure/owncloud/data/mattguice/files/20151114Town
Hall Meeting (Choral Prelude).mp3, block 977
A .
A audio
A audio/.mp3crc
A audio/0015.ps
And so on and so forth.
Post by Kenneth Loafman
One quick question about the format of the traceback you show... is that
because of the email client you use, or does it show up that way on the
console?
I'd like to see the block of data at the first of the log, say the first
200 lines. Please dump in raw text format.
Post by c***@ccs.covici.com
I have thousands of files, some large and some small; it's about a
terabyte's worth. It stopped pretty near the end and it just stopped
again, so my emphasis would be to fix the error and maybe do something
about optimizing the program.
Post by Kenneth Loafman
Do you have a lot of very small files? This should not take too long.
Post by c***@ccs.covici.com
Well, what happens is that it says A and all the filenames, but no
reference to a tar file, so it's not really backing up again; this is why
I was wondering what it is doing. The part of the program which seems to
be doing this is:

    """
    Fake writing to backend, but do go through all the source paths.
    Stop when we have processed the last file and block from the
    last backup.  Normal backup will proceed at the start of the
    next volume in the set.

    @type tarblock_iter: tarblock_iter
    @param tarblock_iter: iterator for current tar block

    @rtype: int
    @return: constant 0 (zero)
    """
    last_index = globals.restart.last_index
    last_block = globals.restart.last_block
    # Just spin our wheels
    iter_result = tarblock_iter.next()
    while iter_result:
        if tarblock_iter.previous_index == last_index:
            # If both the previous index and this index are done, exit now
            # before we hit the next index, to prevent skipping its first
            # block.
            if not last_block and not tarblock_iter.previous_block:
                break
            # Only check block number if last_block is also a number
            if last_block and tarblock_iter.previous_block > last_block:
                break
        if tarblock_iter.previous_index > last_index:
            log.Warn(_("File %s complete in backup set.\n"
                       "Continuing restart on file %s.") %
--
Your life is like a penny. You're going to lose it. The question is:
How do you spend it?

John Covici
***@ccs.covici.com
c***@ccs.covici.com
2016-05-14 03:39:43 UTC
Permalink
Well, this time it has gotten up to the following, and this was at 02:27 PM:
Writing duplicity-full-signatures.20160506T202416Z.sigtar.gpg
Deleting
/root/.cache/duplicity/backup_linux/duplicity-full-signatures.20160506T202416Z.sigtar.gpg
Writing duplicity-full.20160506T202416Z.manifest.gpg
Deleting
/root/.cache/duplicity/backup_linux/duplicity-full.20160506T202416Z.manifest.gpg
and it's just sitting there, using CPU time. Can anyone tell me what is
going on here? I don't want to kill the process in case it's still doing
something useful.

Thanks in advance for any suggestions.
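One way to tell whether a step like this is still making progress is to check whether the file it is producing keeps growing. A small sketch of that check; the sample interval is arbitrary, and the path would have to be whichever file duplicity is currently writing:

import os
import time

def watch_growth(path, interval=60, samples=5):
    """Report how fast a file is growing; several samples with no growth
    usually means the work has moved elsewhere (or the process is stuck)."""
    last = os.path.getsize(path)
    for _ in range(samples):
        time.sleep(interval)
        now = os.path.getsize(path)
        print("%s: %+d bytes in the last %ds" % (path, now - last, interval))
        last = now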
--
Your life is like a penny. You're going to lose it. The question is:
How do you spend it?

John Covici
***@ccs.covici.com
c***@ccs.covici.com
2016-05-14 12:43:36 UTC
Permalink
So, it looks like your typo change fixed one problem I was having.

Thanks.
--
Your life is like a penny. You're going to lose it. The question is:
How do you spend it?

John Covici
***@ccs.covici.com