Search Results

Search found 263 results on 11 pages for 'archiving'.

Page 6/11 | < Previous Page | 2 3 4 5 6 7 8 9 10 11  | Next Page >

  • Gmail undo yellow notification area disappears too rapidly on Chrome

    - by stephf0716
    I'm having trouble with the Gmail yellow notification area disappearing too quickly in Google Chrome. For reference, I am talking about the notification that appears at the top of Gmail after archiving or deleting a message from a web browser. I tried it on IE9 and it works fine. I have also cleared the cache and cookies in Chrome, but the issue persists. Has anybody run into this on Chrome and found a solution?

    Read the article

  • Small Business Solution

    - by user30393
    Looking for ideas/thoughts on a small office setup. Users: 25; remote users: 5; remote offices: 3. I'm a big fan of Small Business Server, but I'm looking for mail archiving and NAS storage solutions to separate user data from AD and email. I look forward to your thoughts and setups. Input from anyone with experience of hosted solutions would also be nice. Thanks

    Read the article

  • Organization: Ways to link/group documents with emails?

    - by Scott Smith
    I like keeping my stuff organized, but short of printing everything out and keeping it in an actual file cabinet, I've never figured out a good way to link/group document files with related emails. This means that when I'm looking for something, I often have to search in my email program, and then through the documents stored in some filesystem folder. Has anyone out there come up with a neat way to group related stuff like this for searching, archiving, etc?

    Read the article

  • How to avoid compressing compressed files

    - by Gzorg
    Most compression programs compress all files by default. But when archiving a folder containing already-compressed files (archives, packed setup programs, JPEGs, movies, MP3s, ...), there is no need to compress them a second time. Are there any compression programs that allow an arbitrary list of file types to be stored uncompressed while the others are still compressed? It looks like WinRAR can't. I expect this would be doable with tar+gzip/bzip2 and some scripting in various ways. Edit: WinRAR can.
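
    For the record, WinRAR does seem to expose exactly this: the -ms switch takes a semicolon-separated list of extensions to store without compression (and, if I'm reading the manual right, called with no list it falls back to a built-in default set of common compressed formats). A minimal sketch, assuming the rar command-line binary is on the PATH, with archive.rar and myfolder as illustrative names; the quotes stop the shell (or cmd.exe) from treating the semicolons as command separators:

        rem Store the listed (already-compressed) types as-is; compress the rest.
        rar a -r "-msjpg;zip;mp3;avi;rar" archive.rar myfolder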

    Read the article

  • RAR Command Line Maximum Compression

    - by Steve Robathan
    I am writing a batch file to compress a folder using various archiving applications. I currently use WinRAR (x64) but set the parameters up manually, and I would like to add rar to my batch file. The folder concerned has many subfolders, and I need to take this into account. What is the command line for the following, keeping the folder structure: solid archive, archive format = RAR5, compression method = Best, dictionary size = 1024 MB? Many thanks
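
    Going by the switch list in the WinRAR manual, those settings map onto the rar command line roughly as sketched below; double-check against rar -? on your build, and treat archive.rar and myfolder as placeholder names:

        rem -r recurse subfolders, -s solid archive, -ma5 RAR5 format,
        rem -m5 best compression, -md1024m 1024 MB dictionary
        rar a -r -s -ma5 -m5 -md1024m archive.rar myfolder

    As far as I can tell, the 1024 MB dictionary is only available with the RAR5 format in the 64-bit build, and compressing with it needs several gigabytes of free RAM.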

    Read the article

  • Experiences with eXdupe?

    - by ewwhite
    I noticed that the eXdupe compression/archiving/deduplication utility was recently mentioned in another post here. It boasts some interesting features, and I've been playing with it for the past day. It's basically a cross-platform, highly multithreaded archival tool. http://exdupe.com/index.html I'm curious if anyone here uses it in production or has any tips on how to leverage the tool in their environment. I'm looking for suggestions.

    Read the article

  • Make Thunderbird store all mail locally for IMAP accounts without indexing it all for search

    - by rubo77
    In Thunderbird, the global (gloda) search is tied to the selection of downloaded/synchronized IMAP folders in the offline settings. Is it somehow possible to have Thunderbird download/sync all emails in an IMAP account but not add them to the index for the global search? I would like to do this because I have some accounts that I keep in Thunderbird only for archiving reasons, and I don't want to find those mails when I use the global search.

    Read the article

  • 7-Zip Command Line Maximum Compression

    - by Steve Robathan
    I am writing a batch file to compress a folder using various archiving applications. I currently use 7-Zip but set the parameters up manually, and I would like to add 7-Zip to my batch file. The folder concerned has many subfolders, and I need to take this into account. What is the command line for the following, keeping the folder structure: archive format = 7z, compression level = Ultra, compression method = LZMA, dictionary size = 512 MB, word size = 273, solid archive? Many thanks
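
    Those settings map onto 7-Zip's switches fairly directly; a sketch, assuming 7z.exe is on the PATH and treating archive.7z and myfolder as placeholder names (naming the folder itself keeps its subfolder structure in the archive):

        rem -t7z 7z format, -mx=9 ultra level, -m0=lzma LZMA method,
        rem -md=512m 512 MB dictionary, -mfb=273 word size, -ms=on solid
        7z a -t7z -mx=9 -m0=lzma -md=512m -mfb=273 -ms=on archive.7z myfolder

    Be aware that LZMA compression memory is roughly ten times the dictionary size, so a 512 MB dictionary wants several gigabytes of free RAM.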

    Read the article

  • Retail Link data storage requirements

    - by Randy Walker
    I was asked today how much data an average Retail Link analyst (Walmart vendor) would consume. I thought I would write this small post for future reference. Of course, this vastly depends on the number of SKUs, how long you want to archive data, and whether you want store-level sales. Most reports take up very little space. Often when you download a report (total sales per SKU for last week), you will overwrite the previous week's report. However, most users will take the data inside their downloaded report and add it to a database or a larger Excel spreadsheet. This way, the user has a history of the sales of each item/SKU per week over the last 2+ years. I would estimate 1 user consumes around 1-2 GB of space, at most, over the course of 2 years. If you start archiving store-level sales, those numbers can quickly increase to 10 GB or more.

    Read the article

  • Why is my content database so large?

    - by PeterBrunone
    If your SharePoint site collection hasn't grown, but your content database has, the most likely culprit is versioning. If a list -- or worse, a library -- has versioning enabled, the default is to keep every single one. That means that every time someone edits and checks in a document, its storage footprint increases by the size of the document (and probably a little more). The solution? It could be a bit painful, but you'll need to go back into each library and restrict the number of versions to keep (three is sufficient for most uses, but your needs may vary). I suggest keeping only major versions as well, since minor versions are really just stopping points on the way to a published document. Of course, if you have a real business need to keep all those versions around, then you'll want to look into an archiving solution that will take the old versions out of the content database but still make them available if necessary.

    Read the article

  • Google Site Search (commercial) not indexing files in sitemap

    - by melat0nin
    I have a client for whom we have purchased Google Site Search. It works well for HTML pages served by the CMS, but files aren't being reliably indexed. I wrote a script to generate an XML feed (sitemap) of all the files in the CMS, which I've plugged into Google Webmaster Tools for the site. It says that 923 URLs have been submitted for that sitemap, but only 26 have been indexed. The client relies heavily on searching within files, which is why we decided to use Google search, so this is a bit of a problem. Many of the files aren't linked to from any page on the site, as they are old and therefore don't merit having a page of their own, but they still need to be accessible through search for archiving purposes. The file-archive XML can be found at www.sniffer.org.uk/file-archive, and the standard XML sitemap (of pages) at www.sniffer.org.uk/sitemap.xml. Any thoughts would be much appreciated!

    Read the article

  • The Emergence of a New Architecture for Long-term Data Retention

    - by Claudia Caramelli-Oracle
    Dear Partner, a new research report from Wikibon explains how the combination of flash and tape makes for a superior solution for long-term data archives versus using dedupe appliances. The combination of these two technologies, one of which has been on the market for a few years and the other for decades, introduces a new concept: "Flape", a term first coined by Wikibon in October 2012. Flape is a combination of flash (SSD) technology and tape; when used for long-term archiving, this combination can save IT departments as much as 300% of their overall IT budget over the course of 10 years. Do you want to know more? You can review the whole report here.

    Read the article

  • JOB OF THE WEEK

    - by Tim Koekkoek
    Placement in the Contract and Business Practice Services department (50%), Baden (Switzerland). This placement in the Contract and Business Practice Services department is challenging and diverse, and you will support and contribute to the contract teams with the creation and technical archiving of documents. All duties are carried out in close coordination with account management and several contract teams, so you will need great communication skills in both German and English, great organizational skills, and the flexibility to deal with different stakeholders. You will be working in a very international organization and get the chance to work out your own ideas and develop your skills and your career in one of the biggest technology companies in the world! If you are interested in this position, read more here! For all of our other vacancies and internships, please visit https://campus.oracle.com.

    Read the article

  • jQuery: Gmail Star?

    - by st4ck0v3rfl0w
    Hi all, I was wondering if anyone has any good tutorials on creating a Gmail inbox star (favorite)? EDIT: I guess I want to create something just like the Stack Overflow star or the Gmail inbox star. I have a set of list items to which I've added several controls: one is a checkbox (for archiving or batch deleting), and another is a placeholder checkbox for favoriting. I'm just curious what the best approach is for making it snazzy with jQuery.

    Read the article

  • Outlook email address autocompletion slow

    - by user214984
    Hello, I'm using Outlook 2007. Two days ago we enabled auto-archiving; maybe that's the source of our problems, but I still have to investigate. At any rate, auto-completion of email addresses in the To/Cc/Bcc fields has become a show-stopper, taking up to a minute to do anything. I've searched the web, but found only references to problems with McAfee, which I don't have installed (http://www.groovypost.com/howto/microsoft/outlook/fix-slow-outlook-email-address-auto-complete/). Thanks, Holger

    Read the article

  • Hiding a deprecated SharePoint web

    - by BeraCim
    Hi all, I want to hide a SharePoint web that has been deprecated (via custom means) due to the release of a newer version, whether by making it invisible in Sites and Workspaces or via some special archiving function provided by SharePoint. Basically, I do not wish users to be able to see the deprecated site. I was wondering what the options are for doing so, both programmatically and via SharePoint utilities/interfaces. Thanks.

    Read the article

  • How to monitor a POP, SMTP and Exchange Server for mail activity

    - by Gerhard
    We need to write a .NET (C#) application that monitors all mail activity through a POP, SMTP, and Exchange Server (2007 and later) and essentially grabs the mail for archiving into a document management system. I realise that the way to monitor each type of server would probably be different, so I'd like to know the best (most elegant and reliable) way to achieve this for each. Thanks.

    Read the article

  • Does NSKeyedUnarchiver autorelease?

    - by Lee Probert
    I'm doing some archiving to a property list, and when I unarchive my data using NSKeyedUnarchiver, I find that my app crashes if I release the object afterwards. I was wondering whether the finishDecoding message also autoreleases the object. It seems weird that it crashes when I release it.

    Read the article

  • SQL Server architecture guidance

    - by Liam
    Hi, we are designing a new version of our existing product on a new schema. It's an internal web application with possibly 100 concurrent users (max) and will run on a SQL Server 2008 database. One of the recent discussion items is whether we should have a single database or split it, for performance reasons, across two separate databases. The database could grow anywhere from 50-100 GB over 5 years. We are developers, not DBAs, so it would be nice to get some general guidance. [I know the answer is not simple, as it depends on the schema, archiving policy, amount of data, etc.]

    Option 1: a single main database [this is my preferred option]. The plan would be to have all the tables in a single database, and possibly to use filegroups and partitioning to separate the data, if required, across multiple disks [using schemas if appropriate]. This should deal with the performance concerns. One comment on this was that a single server instance would still be processing the data, so there would still be a processing bottleneck. For reporting we could have a separate reporting DB, but this is still being discussed.

    Option 2: split the database into two separate databases. DB1: customers, accounts, customer resources, etc. DB2: the bulk of the data [i.e. vehicle tracking data, financial transaction tables, etc.]. These tables would typically contain a lot of data [and DB2 could reside on a separate server if required]. This plan would involve keeping the main data in a smaller database [DB1] and the [mainly] read-only, transaction-type data in a separate DB [DB2]. The UI would mainly read from DB1 and thus be more responsive. [I'm aware that this option makes it harder to enforce referential integrity.]

    Points for consideration: as we are at the design stage, we can at least make proper use of indexes to deal with performance issues, which is why option 1 is attractive to me; it is also the more standard approach. For both options we are considering implementing an archiving database. Apologies for the long question. In summary, the question is: 1 DB or 2? Thanks in advance, Liam

    Read the article

  • Decide which caching strategy to use?

    - by hib
    Hi all, I want to cache my loaded data so that I can reduce my application's start time. I know several strategies for storing application data: Core Data, NSUserDefaults, and archiving. My scenario is that I have an array of at most 10 objects, each object having 5 fields, and I cannot decide which strategy to use for storing this array and later retrieving it. Thanks.

    Read the article

  • How to automate org-refile for multiple todos

    - by lawlist
    I'm looking to automate org-refile so that it will find all of the matches and refile them to a specific location (but not archive them). I found a fully automated method of archiving multiple todos, and I am hopeful to find or create (with some help) something similar to this awesome function, but for a different heading/location rather than archiving: https://github.com/tonyday567/jwiegley-dot-emacs/blob/master/dot-org.el

        (defun org-archive-done-tasks ()
          (interactive)
          (save-excursion
            (goto-char (point-min))
            (while (re-search-forward "\* \\(None\\|Someday\\) " nil t)
              (if (save-restriction
                    (save-excursion
                      (org-narrow-to-subtree)
                      (search-forward ":LOGBOOK:" nil t)))
                  (forward-line)
                (org-archive-subtree)
                (goto-char (line-beginning-position))))))

    I also found this (written by aculich), which is a step in the right direction, but it still requires invoking the function manually: http://stackoverflow.com/questions/7509463/how-to-move-a-subtree-to-another-subtree-in-org-mode-emacs

        ;; I also wanted a way for org-refile to refile easily to a subtree,
        ;; so I wrote some code and generalized it so that it will set an
        ;; arbitrary immediate target anywhere (not just in the same file).
        ;;
        ;; Basic usage is to move somewhere in Tree B and type C-c C-x C-m
        ;; to mark the target for refiling, then move to the entry in Tree A
        ;; that you want to refile and type C-c C-w, which will immediately
        ;; refile into the target location you set in Tree B without
        ;; prompting you, unless you called org-refile-immediate-target with
        ;; a prefix arg C-u C-c C-x C-m.
        ;;
        ;; Note that if you press C-c C-w in rapid succession to refile
        ;; multiple entries it will preserve the order of your entries even
        ;; if org-reverse-note-order is set to t, but you can turn it off to
        ;; respect the setting of org-reverse-note-order with a double
        ;; prefix arg C-u C-u C-c C-x C-m.

        (defvar org-refile-immediate nil
          "Refile immediately using `org-refile-immediate-target' instead of prompting.")
        (make-local-variable 'org-refile-immediate)

        (defvar org-refile-immediate-preserve-order t
          "If last command was also `org-refile' then preserve ordering.")
        (make-local-variable 'org-refile-immediate-preserve-order)

        (defvar org-refile-immediate-target nil
          "Value uses the same format as an item in `org-refile-targets'.")
        (make-local-variable 'org-refile-immediate-target)

        (defadvice org-refile (around org-immediate activate)
          (if (not org-refile-immediate)
              ad-do-it
            ;; if last command was `org-refile' then preserve ordering
            (let ((org-reverse-note-order
                   (if (and org-refile-immediate-preserve-order
                            (eq last-command 'org-refile))
                       nil
                     org-reverse-note-order)))
              (ad-set-arg 2 (assoc org-refile-immediate-target
                                   (org-refile-get-targets)))
              (prog1 ad-do-it
                (setq this-command 'org-refile)))))

        (defadvice org-refile-cache-clear (after org-refile-history-clear activate)
          (setq org-refile-targets (default-value 'org-refile-targets))
          (setq org-refile-immediate nil)
          (setq org-refile-immediate-target nil)
          (setq org-refile-history nil))

        ;;;###autoload
        (defun org-refile-immediate-target (&optional arg)
          "Set current entry as `org-refile' target.
        Non-nil turns off `org-refile-immediate', otherwise `org-refile'
        will immediately refile without prompting for target using most
        recent entry in `org-refile-targets' that matches
        `org-refile-immediate-target' as the default."
          (interactive "P")
          (if (equal arg '(16))
              (progn
                (setq org-refile-immediate-preserve-order
                      (not org-refile-immediate-preserve-order))
                (message "Order preserving is turned: %s"
                         (if org-refile-immediate-preserve-order "on" "off")))
            (setq org-refile-immediate (unless arg t))
            (make-local-variable 'org-refile-targets)
            (let* ((components (org-heading-components))
                   (level (first components))
                   (heading (nth 4 components))
                   (string (substring-no-properties heading)))
              (add-to-list 'org-refile-targets
                           (append (list (buffer-file-name))
                                   (cons :regexp
                                         (format "^%s %s$"
                                                 (make-string level ?*) string))))
              (setq org-refile-immediate-target heading))))

        (define-key org-mode-map "\C-c\C-x\C-m" 'org-refile-immediate-target)

    It sure would be helpful if aculich, or some other maven, could please create a variable similar to (setq org-archive-location "~/0.todo.org::* Archived Tasks") so users can specify the file and heading, which is already a part of the org-archive-subtree functionality. I'm doing a search and mark because I don't have the wherewithal to create something like org-archive-location for this setup.

    EDIT: One step closer -- almost home free . . .

        (defun lawlist-auto-refile ()
          (interactive)
          (goto-char (point-min)) ;; goto-char instead of beginning-of-buffer,
                                  ;; which is meant for interactive use only
          (re-search-forward "\* UNDATED")
          (org-refile-immediate-target) ;; cursor must be on a heading to work.
          (save-excursion
            (re-search-backward "\* UNDATED")
            ;; must be written in such a way that sub-entries of * UNDATED
            ;; are not searched; or else infinite loop.
            (while (re-search-backward "\* \\(None\\|Someday\\) " nil t)
              (org-refile))))

    Read the article

  • Unable to remove limit on memory usage for PHP script.

    - by Jess Telford
    The Situation

    I am having an issue with a PHP script getting the following error message:

        Fatal error: Out of memory (allocated 359923712) (tried to allocate 72 bytes) in /path/to/piwik/core/DataTable.php on line 969

    The script I'm running is /path/to/piwik/misc/cron/archive.sh. I am assuming the numbers are bytes, which means that total is approximately 360 MB. For all intents and purposes, I have increased the memory limits on the server well above 360 MB, yet this is the number (give or take a byte) at which it consistently errors out.

    Please note: this question is not about fixing a memory leak in the script, nor about why the script itself is using so much memory. The script is part of the Piwik archiving process, so I cannot just fix any memory leaks, etc. For more info on this script and why I am increasing the memory limit, see "How to setup auto archiving".

    The question

    Given that the script is attempting to use over 360 MB of memory, which I cannot change, why does it not seem possible for me to increase the amount of memory available to PHP on my server?

    What I've tried

    Increasing PHP's memory_limit. Given the php.ini file:

        $ php -i | grep php.ini
        Configuration File (php.ini) Path => /usr/local/lib
        Loaded Configuration File => /usr/local/lib/php.ini

    I have edited that file so the memory_limit directive reads:

        memory_limit = -1

    Restart Apache, and check the new value has stuck:

        $ php -i | grep memory_limit
        memory_limit => -1 => -1

    Run the script, and get the same error. I've also tried 1G, 768M, etc., all with the same result (i.e. no change).

    Update 22nd June: based on Vangel's help, I have attempted to set post_max_size to 20M in combination with setting memory_limit. Again, this has no effect.

    Removing the memory limit on child processes of Apache. I have found and edited the httpd.conf file to make sure there is no RLimitMEM directive. I then used WHM's Apache Configuration, Memory Usage Restrictions to generate a restriction, which it claimed was at 1000M (and confirmed by checking httpd.conf). Both of these resulted in no change to the script erroring at 360 MB.

    Increasing the per-process memory limits of Linux. The current limits set on the system:

        $ ulimit -m
        524288
        $ ulimit -v
        524288

    I have attempted to set both of these to unlimited:

        $ ulimit -m unlimited
        $ ulimit -v unlimited
        $ ulimit -m
        unlimited
        $ ulimit -v
        unlimited

    Once again, this has resulted in absolutely no improvement in my problem.

    My setup

        $ cat /etc/redhat-release
        CentOS release 5.5 (Final)
        $ uname -a
        Linux example.com 2.6.18-164.15.1.el5 #1 SMP Wed Mar 17 11:30:06 EDT 2010 x86_64 x86_64 x86_64 GNU/Linux
        $ php -i | grep "PHP Version"
        PHP Version => 5.2.9
        $ httpd -V
        Server version: Apache/2.0.63
        Server built:   Feb 2 2011 01:25:12
        Cpanel::Easy::Apache v3.2.0 rev5291
        Server's Module Magic Number: 20020903:13
        Server loaded:  APR 0.9.17, APR-UTIL 0.9.15
        Compiled using: APR 0.9.17, APR-UTIL 0.9.15
        Architecture:   64-bit
        Server compiled with....
         -D APACHE_MPM_DIR="server/mpm/prefork"
         -D APR_HAS_SENDFILE
         -D APR_HAS_MMAP
         -D APR_HAVE_IPV6 (IPv4-mapped addresses enabled)
         -D APR_USE_SYSVSEM_SERIALIZE
         -D APR_USE_PTHREAD_SERIALIZE
         -D SINGLE_LISTEN_UNSERIALIZED_ACCEPT
         -D APR_HAS_OTHER_CHILD
         -D AP_HAVE_RELIABLE_PIPED_LOGS
         -D HTTPD_ROOT="/usr/local/apache"
         -D SUEXEC_BIN="/usr/local/apache/bin/suexec"
         -D DEFAULT_PIDLOG="logs/httpd.pid"
         -D DEFAULT_SCOREBOARD="logs/apache_runtime_status"
         -D DEFAULT_LOCKFILE="logs/accept.lock"
         -D DEFAULT_ERRORLOG="logs/error_log"
         -D AP_TYPES_CONFIG_FILE="conf/mime.types"
         -D SERVER_CONFIG_FILE="conf/httpd.conf"

    Output of $ php -i: http://pastebin.com/EiRut6Nm
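
    A quick way to check whether the cap is even coming from PHP is to bypass php.ini and force the limit for a single CLI run while watching actual usage grow. A minimal diagnostic sketch (the 400 MB target is just an illustrative figure above the 360 MB failure point):

        # Print the memory_limit the CLI binary actually sees, then allocate
        # 1 MB chunks up to ~400 MB, printing real usage as it grows. If this
        # also dies around 360 MB, the cap is being imposed from outside PHP
        # (e.g. by the OS or the allocator), not by php.ini.
        php -d memory_limit=-1 -r '
            echo "memory_limit: ", ini_get("memory_limit"), "\n";
            $chunks = array();
            for ($i = 0; $i < 400; $i++) {
                $chunks[] = str_repeat("x", 1024 * 1024);
                echo memory_get_usage(true), "\n";
            }
        '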

    Read the article

  • Zantaz's EAS exchange archive product doesn't integrate with Outlook in Windows 7

    - by Chris Farmer
    I work at a place that uses Exchange with Outlook for mail, and they also use a product from Zantaz called "EAS" which does server-side archiving of old messages. One of the artifacts of this archival is that email attachments are missing from the archived messages when viewed in Outlook. EAS has a client tool that plugs into Outlook that enables easy retrieval of those archived messages and their attachments, but it doesn't seem to work when installed on Windows 7. I have no direct evidence of this, other than that it simply doesn't work on my Windows 7 machine, and some of our network support staff seem to corroborate this. The symptom is that Outlook seems to know nothing of the existence of this EAS app. The EAS app also has a system tray icon which is there and offers some minimal functionality, but the real goodness is the Outlook integration, which I sorely miss. So, my question is this: does anyone here know whether it's possible to coerce this EAS product into working correctly in Windows 7?

    Read the article