Search Results

Search found 16404 results on 657 pages for 'easy transfer'.


  • I disconnected my cellphone while transferring files to its Mini SD card. Now the files aren't there

    - by Martín Fixman
    I use Ubuntu 9.10, and the MiniSD card shows its space as used, as if the files were still there. Baobab (the disk usage analyzer) shows that the card only has 118 MB used (of the 401 MB Ubuntu claims are used). Of course, I have already tried the obvious (rebooting the phone, adding and removing files, etc.), but I don't want to format my card, because I still have some files on it, the transfer to my computer is slow, and because I use an old cable it often fails.
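
    Before considering a format, it may be worth letting dosfstools reconcile the FAT allocation with the directory entries, since an interrupted transfer often leaves clusters marked as used with no directory entry pointing at them. A minimal sketch, assuming the card appears as /dev/sdc1 (the device name is a placeholder):

        # Unmount the card first (adjust the device to match your system)
        sudo umount /dev/sdc1

        # Read-only check: reports lost clusters and directory inconsistencies
        sudo fsck.vfat -n -v /dev/sdc1

        # Interactive repair - run only after reviewing the report above
        sudo fsck.vfat -r /dev/sdc1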

    Read the article

  • How to show declined meeting on Outlook calendar

    - by msorens
    I declined a recurring meeting invitation in Outlook 2007 to let the organizer know I am unable to attend, which will be true most of the time. But there will be rare occasions when I could attend so I would like to have this declined meeting show up on my calendar. Currently, it is sitting in my Deleted Items folder and I find no way to move/copy/transfer it to the calendar. Also, contrary to this post -- I do not even see an option to redo my accept/decline choice.

    Read the article

  • Rsync: remote source and destination

    - by goncalopp
    If both source and destination are remote, rsync complains:

        The source and destination cannot both be remote.
        rsync error: syntax or usage error (code 1) at main.c(1156) [Receiver=3.0.7]

    Is there an insurmountable technical obstacle to making rsync do this, or is it simply a case of not-yet-implemented? It seems relatively easy to create a local buffer in memory that mediates the transfer between the two remotes, holding both hashes and data. Alternatively, is there other (Unix) software that implements this functionality?
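
    The usual workaround is not an in-memory relay but making one end local, either by running rsync on one of the remotes over SSH or by staging through the local machine. A sketch with hypothetical host names:

        # Option 1: run rsync on hostA and push straight to hostB (hostA must be able to reach hostB)
        ssh user@hostA 'rsync -av /data/ user@hostB:/data/'

        # Option 2: relay through the local machine, trading bandwidth for simplicity
        rsync -av user@hostA:/data/ /tmp/relay/
        rsync -av /tmp/relay/ user@hostB:/data/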

    Read the article

  • Interop.Outlook.UserProperties.Add causing problem during connection time

    - by aanataliya
    Hi all, I have created a plug-in for Outlook. The plug-in contains only the code below:

        private void OnNewOutlookInspector(Outlook.Inspector OutlookInsptr)
        {
            Outlook.MailItem MlItem = (Outlook.MailItem)OutlookInsptr.CurrentItem;
            // If I remove the line below, everything works fine.
            MlItem.UserProperties.Add("INSPINIT", Outlook.OlUserPropertyType.olText, true, true).Value = "1";
        }

        public void OnConnection(object application, Extensibility.ext_ConnectMode connectMode, object addInInst, ref System.Array custom)
        {
            applicationObject = application;
            addInInstance = addInInst;
            MessageBox.Show("in connection new 2");
            OutlkApp = (Outlook.Application)application;
            OutlkInsptrs = OutlkApp.Inspectors;
            OutlkInsptrs.NewInspector += new Outlook.InspectorsEvents_NewInspectorEventHandler(OnNewOutlookInspector);
        }

    The problem I am facing is this: when I send an HTML mail while the plug-in is enabled, it is received as plain text at the receiving end. Below is the mail content, along with the header and body, at the receiving end:

        x-sender: [email protected]
        x-receiver: [email protected]
        Received: from blr-s-07.pointcrossblr.com ([192.168.1.107]) by blr-ws-134.pointcrossblr.com with Microsoft SMTPSVC(6.0.2600.5949); Wed, 22 Dec 2010 17:11:02 +0530
        Received: from blrws134 ([192.168.1.175]) by blr-s-07.pointcrossblr.com with Microsoft SMTPSVC(6.0.3790.4675); Wed, 22 Dec 2010 17:11:02 +0530
        From: "Ashif Nataliya" <[email protected]>
        To: <[email protected]>
        Cc: <[email protected]>
        Subject: RTF FRM blr to pc.com cc blr-ws-134
        Date: Wed, 22 Dec 2010 17:11:02 +0530
        Message-ID: <[email protected]>
        MIME-Version: 1.0
        Content-Type: multipart/mixed; boundary="----=_NextPart_000_00F7_01CBA1FB.36115580"
        X-Mailer: Microsoft Outlook 14.0
        Content-Language: en-us
        X-MS-TNEF-Correlator: 00000000DCB2344DE8F50F4FBC91085BB5C06D55A4172000
        thread-index: AcuhzRuTOBkvHPUnS1aLi9+cHNAWhA==
        Return-Path: [email protected]
        X-OriginalArrivalTime: 22 Dec 2010 11:41:02.0822 (UTC) FILETIME=[1C788860:01CBA1CD]

        This is a multipart message in MIME format.

        ------=_NextPart_000_00F7_01CBA1FB.36115580
        Content-Type: text/plain; charset="us-ascii"
        Content-Transfer-Encoding: 7bit

        HTML Test Test Mail

        ------=_NextPart_000_00F7_01CBA1FB.36115580
        Content-Type: application/ms-tnef; name="winmail.dat"
        Content-Transfer-Encoding: base64
        Content-Disposition: attachment; filename="winmail.dat"
        // and some other code.....

    Any help is appreciated. Thanks.

    Read the article

  • Why is curl in Ruby slower than command-line curl?

    - by Stiivi
    I am trying to download more than 1M pages (URLs ending in a sequence ID). I have implemented a kind of multi-purpose download manager with a configurable number of download threads and one processing thread. The downloader fetches files in batches:

        curl = Curl::Easy.new
        batch_urls.each { |url_info|
            curl.url = url_info[:url]
            curl.perform
            file = File.new(url_info[:file], "wb")
            file << curl.body_str
            file.close
            # ... some other stuff
        }

    I have tried to download a sample of 8000 pages. Using the code above, I get 1000 in 2 minutes. When I write all URLs into a file and run, in a shell, cat list | xargs curl, I get all 8000 pages in two minutes. The thing is, I need to have it in Ruby code, because there is other monitoring and processing code. I have tried: Curl::Multi - it is somewhat faster, but misses 50-90% of the files (it does not download them and gives no reason/code); multiple threads with Curl::Easy - around the same speed as single-threaded. Why is a reused Curl::Easy slower than subsequent command-line curl calls, and how can I make it faster? Or what am I doing wrong? I would prefer to fix my download manager code rather than download in a different way for this case. Before this, I was calling command-line wget, which I provided with a file containing the list of URLs. However, not all errors were handled, and it was not possible to specify a separate output file for each URL when using a URL list. Now it seems to me that the best way would be to use multiple threads with a system call to the 'curl' command. But why, when I can use Curl directly in Ruby? Code for the download manager is here, if it might help: Download Manager (I have played with timeouts, from not setting them to various values; it did not seem to help.) Any hints appreciated.

    Read the article

  • Recommendations for an efficient offsite remote backup solution for VMs

    - by senorsmile
    I am looking for recommendations for backing up my current 6 VMs (soon to grow to up to 20). Currently I am running a two-node Proxmox cluster (a Debian base using KVM for virtualization, with a custom web front end to administer). I have two nearly identical boxes with AMD Phenom II X4s and Asus motherboards. Each has four 500 GB SATA II HDDs: one for the OS and other data for the Proxmox install, and three using mdadm + DRBD + LVM to share 1.5 TB of storage between the two machines. I mount LVM images in KVM for all of the virtual machines. I currently have the ability to do live transfer from one machine to the other, typically within seconds (it takes about 2 minutes on the largest VM, running Win2008 with MS SQL Server). I am using Proxmox's built-in vzdump utility to take snapshots of the VMs and store those on an external hard drive on the network. I then have the JungleDisk service (using Rackspace) sync the vzdump folder for remote offsite backup. This is all fine and dandy, but it's not very scalable. For one, the backups themselves can take up to a few hours every night. With JungleDisk's block-level incremental transfers, the sync only transfers a small portion of the data offsite, but that still takes at least half an hour. The much better solution would of course be something that allows me to instantly take the difference of two points in time (say what was written from 6am to 7am), zip it, then send that difference file to the backup server, which would instantly transfer it to the remote storage on Rackspace. I have looked a little into ZFS and its ability to do send/receive. That, coupled with piping the data through bzip or something, would seem perfect (see the sketch below). However, it seems that implementing a Nexenta server with ZFS would essentially require at least one or two more dedicated storage servers to serve iSCSI block volumes (via zvols?) to the Proxmox servers. I would prefer to keep the setup as minimal as possible (i.e. NOT having separate storage servers) if at all possible. I have also briefly read about Zumastor. It looks like it could also do what I want, but it appears to have halted development in 2008. So, ZFS, Zumastor or other?
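
    For reference, the send/receive idea mentioned above looks roughly like this (dataset names, schedule and host are hypothetical; it is a sketch of the mechanism, not a drop-in replacement for the vzdump workflow):

        # Snapshot the dataset holding the VM images every hour
        zfs snapshot tank/vms@0600
        zfs snapshot tank/vms@0700

        # Ship only the blocks written between 06:00 and 07:00, compressed over SSH
        zfs send -i tank/vms@0600 tank/vms@0700 | bzip2 -c | \
            ssh backup@offsite 'bunzip2 -c | zfs receive -F backup/vms'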

    Read the article

  • Pass HAProxy healthcheck requests as User-Agent "LB-Check" to the backend webservers (Apache)

    - by Joseph
    I have an HAProxy setup in front of webservers (Apache) for load balancing. Health checks for these webservers are also configured in HAProxy:

        option httpchk HEAD /healthcheck.txt HTTP/1.0

    Is it possible to send these health-check requests to the backend webservers with an "LB-Check" User-Agent (or any other marker), so that I can distinguish them from other log entries? I don't want to go for the "dontlog" option, as I don't want to miss these entries.
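
    HAProxy's httpchk line accepts extra raw headers after the HTTP version string, so the health checks can be tagged in the check request itself. A sketch (the space in the header value is escaped with a backslash):

        option httpchk HEAD /healthcheck.txt HTTP/1.0\r\nUser-Agent:\ LB-Check

    On the Apache side those requests then arrive with User-Agent: LB-Check, which can be matched with SetEnvIf and sent to a separate CustomLog if desired.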

    Read the article

  • Unlimited online backup space for fixed price using rsync/FTP/other simple protocol

    - by barrycarter
    Many companies offer unlimited online backup space for a fixed price (mozy.com, twitter.com/allmydata, onlinestoragesolution.com, etc.), but they use proprietary, non-Linux-friendly software, have gone out of business, and/or don't actually work. Who offers reliable unlimited online backup space for a fixed price that is compatible with rsync, FTP, or other generic/open-source file transfer protocols? Or, has anyone written software that lets me treat Mozy's (or similar) space as though it were regular file space (e.g., a "mozyfs")?

    Read the article

  • Cygwin offline installer?

    - by ripper234
    I hate downloading Cygwin: I have tried to download it several times, from several different computers and networks, and many times it got stuck mid-transfer. Where can I find a reliable offline installer?
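
    Cygwin's setup.exe has a download-only mode and an install-from-local-directory mode, which together effectively give you an offline installer. A sketch (package list and paths are placeholders; the exact switch names should be verified against setup.exe --help):

        REM Step 1 (on a machine/network with a stable connection): download packages only
        setup.exe --download --packages openssh,rsync --local-package-dir C:\cygwin-pkgs

        REM Step 2 (offline, possibly on another machine): install from the local cache
        setup.exe --local-install --local-package-dir C:\cygwin-pkgs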

    Read the article

  • How could I import Postgres data dumps into MS SQL?

    - by dean nolan
    I have some data that is from a Postgres database dump (not CSV or anything) and I am looking to get it into MS SQL. Is there an easy way to do this, or a free tool that doesn't have limits on data import size, etc.? The Postgres database is on a Debian VM, and I could export it to CSV there, but I am new to Linux and don't know how I would transfer the file from the VM to Win 7. Thanks
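
    One low-tech route, sketched here with hypothetical database, table and path names: export CSV on the Debian VM with psql's \copy, pull the files over to Windows with PuTTY's pscp, then load them with SQL Server's import wizard or BULK INSERT.

        # On the Debian VM: export one table to CSV (repeat per table)
        psql -d mydb -c "\copy mytable TO '/home/me/mytable.csv' WITH CSV HEADER"

        # On the Windows 7 machine, fetch the file with PuTTY's pscp.exe:
        #   pscp me@debian-vm:/home/me/mytable.csv C:\imports\mytable.csv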

    Read the article

  • Going from small to medium-sized websites

    - by Landitus
    I've been coding websites for a couple of years now, mostly in PHP and XHTML. I come from the design world, but I'm proud of building standards-compliant websites and great interfaces. I have also used WordPress and loved it. Most of the time these were really simple commercial websites, with no database involved, where everything is done from scratch and every page is served through an index?page=xxx dispatcher. But I have a few prospects that are larger websites (let's call them 'medium-sized websites') where I feel I'm lacking the following: how to dispatch or render the pages (an MVC controller instead of index?page=???); proper page hierarchy and easy breadcrumbs implementation; auto-generation of navigation menus, or an easy way to maintain them; clean URLs; form validation; easy database support. I really don't know if I should be looking into PHP scripts and refining my skills, or get into a CMS (like Drupal) or a PHP framework. I found WordPress very reassuring and didn't feel trapped in crazy conventions, but I feel it is not the right tool for this. I hate the CMS page with the big textbox, as I am used to coding every page by hand; my pages are not just a title and a textbox. Got the feeling? My PHP skills are still sort of medium/low, but I would like to hear some thoughts on what I should learn to take the next step!
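
    For the dispatching point specifically, a hand-rolled front controller is only a handful of lines. This sketch (routes and included files are made up) is the sort of index.php that, together with a mod_rewrite rule sending every request to it, gives clean URLs without adopting a full framework:

        <?php
        // index.php - minimal front controller (illustrative only)
        $routes = array(
            ''        => 'pages/home.php',
            'about'   => 'pages/about.php',
            'contact' => 'pages/contact.php',
        );

        // Map the request path (e.g. /about) to a page script
        $path = trim(parse_url($_SERVER['REQUEST_URI'], PHP_URL_PATH), '/');

        if (isset($routes[$path])) {
            require $routes[$path];
        } else {
            header('HTTP/1.0 404 Not Found');
            require 'pages/404.php';
        }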

    Read the article

  • How to copy a 200GB file faster?

    - by RainDoctor
    I have a 200 GB .tgz file on server A (RHEL 5.2). I want to transfer that file to server B (RHEL 5.3). Server B runs on ESXi 4 Update 1; I gave the Server B VM 10 GB of RAM and 4 vCPUs. Server A and Server B are connected directly with an Ethernet cable using local IP addresses (no switch involved). scp gives me about 3 Mbps. Is there a way to get 400 Mbps?
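
    Since the two boxes are linked by a direct cable, most of the scp slowness is likely the SSH cipher overhead, which a plain TCP copy avoids entirely. A minimal sketch with netcat (host name, port and paths are placeholders, and the listening syntax differs slightly between netcat variants):

        # On server B (receiver): listen on an arbitrary port and write the stream to disk
        nc -l -p 9000 > /data/bigfile.tgz

        # On server A (sender): stream the file across
        nc serverB 9000 < /data/bigfile.tgz

    Running md5sum on both copies afterwards is cheap insurance that nothing was truncated.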

    Read the article

  • Ninety-Fifth Percentile Calculation for Bandwidth

    - by Kyle Brandt
    I am trying to calculate the bandwidth of my current internet connection. I am pulling the current input and output transfer rates via SNMP. If the argument to the following function is an ascending sorted list of the sums of each input and output sample, is this the right way to calculate the 95th percentile?

        sub ninetyFifth { # expects sorted data
            my $ninetyFifthLine = (@_ * .95) - 1;
            return $_[$ninetyFifthLine];
        }

    Read the article

  • Assign a drive letter to a Solaris disk in a Windows box

    - by Cat
    I need some way to map a UFS Solaris drive (i.e., assign a drive letter to it) while it is in a Windows XP box. I've found utilities that will let me transfer files from a Solaris disk to an NTFS disk on the Windows box, but nothing that will let me map/share that Solaris disk. And no, putting the Solaris disk in a Solaris box and using something like Samba to share it is unfortunately not an option. Cat

    Read the article

  • My computer is not reading my PNY SD 1GB memory card

    - by Jessica
    I use a Kodak EasyShare C160 digital camera, a PNY SD 1GB memory card, and a Dell Latitude E5500 computer. I have had my camera for over a year and have always been able to transfer my pictures to my computer. Now my computer does not recognize my memory card and I get a message from the EasyShare software that says "Cannot get device information", although my computer does recognize the pictures stored on my camera's internal memory. Is there any way to access the pictures on my memory card, or are they lost forever?

    Read the article

  • Windows XP: saving large files on network share stalls

    - by mklhmnn
    When I transfer larger files (a few hundred MB) to a network share (either a Buffalo LinkStation or another Windows machine) from my Windows XP Pro SP3 machine, the transfer always stalls. Smaller files are no problem, and reading from a network share is also no problem. I already had this problem on my notebook and now on my desktop machine, so I assume it is most likely not a driver problem. Does anybody have a clue what the problem could be, or better yet, what the solution is?

    Read the article

  • A Digg-like rotating homepage of popular content, how to include date as a factor?

    - by Ferdy
    I am building an advanced image sharing web application. As you may expect, users can upload images and others can comment on them, vote on them, and favorite them. These events determine the popularity of an image, which I capture in a "karma" field. Now I want to create a Digg-like homepage system, showing the most popular images. That part is easy, since I already have the weighted karma score: I just sort on it descendingly to show the 20 most valued images. The part that is missing is time. I do not want extremely popular images to always be on the homepage. I guess an easy solution is to restrict the result set to the last 24 hours. However, I'm also thinking that in order to keep images rotating throughout the day, an image's age could be a variable whose offset influences its sort position. Specific questions: Would you recommend the easy scenario (just sort for the best images within 24 hours) or the more sophisticated one (use the datetime offset as part of the sorting)? If you advise the latter, any help on the mathematical solution (a sketch follows below)? Would it be best to run a scheduled service to mark images for the homepage, or would you advise a direct query (I'm using MySQL)? As an extra note, the homepage should support paging, and on a quiet day it should include entries from previous days in order to make sure it is always "filled". I'm not asking the community to build this algorithm, just looking for some advice :)
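
    If you go the sophisticated route, the common pattern is a "gravity" score in which karma is divided by a power of the image's age, so even very popular images sink after a while. A sketch in MySQL, with the table/column names and the exponent purely illustrative:

        SELECT id,
               karma / POW(TIMESTAMPDIFF(HOUR, created_at, NOW()) + 2, 1.5) AS hot_score
        FROM images
        ORDER BY hot_score DESC
        LIMIT 20;

    A larger exponent rotates the homepage faster; a direct query is fine to start with, and the result can later be cached by a scheduled job once the table grows.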

    Read the article

  • Migrate data from one server to another using rsync

    - by Leonid Shevtsov
    I'm moving from one VPS to another, and I figured that the simplest way to transfer data would be rsync. However, the data is owned by a user, www-data, which doesn't have SSH privileges, and I'd like it to be owned by the same (named) user on the target machine. Obviously I need all file permissions preserved. I have SSH access via another user with sudo privileges on both machines. Is it possible to do this with rsync?
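
    One way this is commonly handled is to let rsync escalate on the remote end instead of logging in as www-data. A sketch, run on the old VPS as the sudo-capable user (it assumes rsync may be run via passwordless sudo on the new machine, since there is no TTY for a password prompt):

        # -a preserves permissions and owner/group names; ownership is applied on the
        # target because the receiving rsync runs as root via --rsync-path
        sudo rsync -a -e ssh --rsync-path="sudo rsync" /var/www/ admin@new-vps:/var/www/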

    Read the article

  • View a PDF with quick web view through an Apache proxy

    - by Musa
    I have a site (IIS) that is accessed via a proxy in Apache (on an IBM i). This site serves PDFs that have quick web view, and if I access a PDF directly from the IIS server it starts to display immediately, but if I go through the proxy I have to wait until the entire PDF downloads before I can view it. In the Apache config file I use:

        ProxyPass /path/ http://xxx.xxx.xxx.xxx/
        <LocationMatch "/path/">
            Header set Cache-Control "no-cache"
        </LocationMatch>

    I tried adding SetEnv proxy-sendcl to the LocationMatch directive; this had no effect. The PDFs that view quickly make a lot of partial requests. These are the initial request and response headers:

        GET http://xxx.xxx.xxx.xxx/xxx.PDF HTTP/1.1
        Host: xxx.xxx.xxx.xxx
        Proxy-Connection: keep-alive
        Cache-Control: no-cache
        Accept: text/html,application/xhtml+xml,application/xml;q=0.9,image/webp,*/*;q=0.8
        Pragma: no-cache
        User-Agent: Mozilla/5.0 (Windows NT 6.2; rv:9.0.1) Gecko/20100101 Firefox/9.0.1
        Accept-Encoding: gzip,deflate,sdch
        Accept-Language: en-US,en;q=0.8
        Cookie: chocolatechip

        HTTP/1.1 200 OK
        Via: 1.1 xxxxxxxx
        Connection: Keep-Alive
        Proxy-Connection: Keep-Alive
        Content-Length: 15330238
        Date: Mon, 25 Aug 2014 12:48:31 GMT
        Content-Type: application/pdf
        ETag: "b6262940bbecf1:0"
        Server: Microsoft-IIS/7.5
        Last-Modified: Fri, 22 Aug 2014 13:16:14 GMT
        Accept-Ranges: bytes
        X-Powered-By: ASP.NET

    This is a partial request and response:

        GET http://xxx.xxx.xxx.xxx/xxx.PDF HTTP/1.1
        Host: xxx.xxx.xxx.xxx
        Proxy-Connection: keep-alive
        Cache-Control: no-cache
        Pragma: no-cache
        User-Agent: Mozilla/5.0 (Windows NT 6.2; rv:9.0.1) Gecko/20100101 Firefox/9.0.1
        Accept: */*
        Referer: http://xxx.xxx.xxx.xxx/xxxx.PDF
        Accept-Encoding: gzip,deflate,sdch
        Accept-Language: en-US,en;q=0.8
        Cookie: chocolatechip
        Range: bytes=0-32767

        HTTP/1.1 206 Partial Content
        Via: 1.1 xxxxxxxx
        Connection: Keep-Alive
        Proxy-Connection: Keep-Alive
        Content-Length: 32768
        Date: Mon, 25 Aug 2014 12:48:31 GMT
        Content-Range: bytes 0-32767/15330238
        Content-Type: application/pdf
        ETag: "b6262940bbecf1:0"
        Server: Microsoft-IIS/7.5
        Last-Modified: Fri, 22 Aug 2014 13:16:14 GMT
        Accept-Ranges: bytes
        X-Powered-By: ASP.NET

    These are the headers I get if I go through the proxy:

        GET /path/xxx.PDF HTTP/1.1
        Host: domain:xxxx
        Connection: keep-alive
        Cache-Control: no-cache
        Accept: text/html,application/xhtml+xml,application/xml;q=0.9,image/webp,*/*;q=0.8
        Pragma: no-cache
        User-Agent: Mozilla/5.0 (Windows NT 6.2; rv:9.0.1) Gecko/20100101 Firefox/9.0.1
        Accept-Encoding: gzip,deflate,sdch
        Accept-Language: en-US,en;q=0.8

        HTTP/1.1 200 OK
        Date: Mon, 25 Aug 2014 13:28:42 GMT
        Server: Microsoft-IIS/7.5
        Content-Type: application/pdf
        Last-Modified: Fri, 22 Aug 2014 13:16:14 GMT
        Accept-Ranges: bytes
        ETag: "b6262940bbecf1:0"-gzip
        X-Powered-By: ASP.NET
        Cache-Control: no-cache
        Expires: Thu, 24 Aug 2017 13:28:42 GMT
        Vary: Accept-Encoding
        Content-Encoding: gzip
        Keep-Alive: timeout=300, max=100
        Connection: Keep-Alive
        Transfer-Encoding: chunked

    I'm guessing it's because the proxy uses Transfer-Encoding: chunked, but I'm not sure and wasn't able to turn it off to check. Browser: Chrome 36.0.1985.143 m, using the native PDF viewer. Any help in getting PDF quick web view working through the proxy would be appreciated.
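
    One avenue worth trying, sketched below with an illustrative path pattern: keep the proxied PDFs from being gzipped at all, since the gzipped, chunked response seen above loses its Content-Length and leaves the viewer with no usable byte ranges to seek into.

        <LocationMatch "^/path/.*\.(pdf|PDF)$">
            # mod_headers: strip Accept-Encoding so the backend replies uncompressed
            RequestHeader unset Accept-Encoding
            Header set Cache-Control "no-cache"
        </LocationMatch>

    If the response then arrives with Content-Length and Accept-Ranges: bytes intact, quick web view should start working; if not, the chunked encoding is being introduced elsewhere in the chain.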

    Read the article

  • Restoring an SD card after using it to root a device

    - by Raz
    I recently purchased the Nook Touch reading device and rooted it using the instructions on nookdevs.com. Rooting included using win32diskimager to transfer an .img file onto the card. Rooting is now complete and, as far as I can tell, I don't need to keep the files used on the SD card, which has been reduced from a roughly 3.5 GB unit to a 75 MB unit. Is there a simple way for me to reformat the SD card to its original state, or at least back to 3.5 GB? I cannot simply format it.
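
    On the same Windows machine that ran win32diskimager, diskpart can usually wipe the partition table that the .img wrote and give the full capacity back. A rough sketch of the interactive session; the disk number 2 is only a placeholder, so identify the card by its size before running clean, which is destructive:

        diskpart
        list disk
        select disk 2
        clean
        create partition primary
        format fs=fat32 quick
        assign
        exit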

    Read the article

  • What technologies allow bidirectional streaming of video?

    - by Roman
    Wikipedia says that Flash allows "bidirectional streaming of audio and video". Is it possible to do that with other technologies (for example, with JavaScript)? In other words, I want to transfer video from one user of a web site to another in real time. I want something that is already installed by many users or is easy to install (Flash fulfills these requirements). And I want something free.

    Read the article

  • How can Rackspace beat DigitalOcean's price point? [on hold]

    - by Matt Jensen
    I have recently discovered DigitalOcean and have found it to be a relatively nice experience for small staging servers, so the thought occurred to me: why am I paying ~$267 for a server on Rackspace (40 GB RAM, 160 GB drive, 2 vCPUs, 400 Mb/s) when DigitalOcean offers a server for $40 (40 GB RAM, 60 GB drive [storage is not a concern of mine], 2 vCPUs, ?Mb/s)? Does Rackspace offer some kind of obvious advantage in transfer speed/bandwidth? My applications are small startups that for the immediate future will only have about 200-300 concurrent users at once.

    Read the article
