Search Results

Search found 67192 results on 2688 pages for 'excel external data'.

  • How can I read a reel-to-reel tape from the 1970s?

    - by Joe Wreschnig
    A close friend of my mother's worked at DEC in the 1970s and 1980s. She recently passed away, and in sorting through her estate, my mother discovered some reel-to-reel magnetic tape. We are curious about what might be on it. I haven't yet seen a picture of it, but Wikipedia tells me it is most likely DECtape. Is there any chance the data on it is still good? It was not preserved with great care, but as far as we know it has also never been particularly abused - just left in a box and moved a few times. If the data is still valid, do we need to dig up a PDP or VAX to read it, or is there a more modern option?

  • IIS FTP service - download timeouts and restarts getting the data twice

    - by accel229
    We have an IIS FTP site on a Windows Server 2003 x64 machine. The Application Layer Gateway service is disabled (so http://support.microsoft.com/kb/931130 does not apply). The Windows Firewall service is disabled as well. The connection timeout for the FTP site (there is only one) is set to 1,200 seconds = 20 minutes. An external client can connect to the site, list directory contents and download small files. When a client attempts to download a large file (e.g., if the download takes 3 minutes, which is still under 20 minutes but relatively long), the server sends all the data, then the connection times out, the client issues REST / RETR commands attempting to restart the download from just after the last byte (which I believe should succeed and receive exactly 0 bytes), and the server behaves as if the client tried to restart after byte 0, that is, it sends the entire file all over again. Any ideas on how to fix this?
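
    A quick way to reproduce the restart behaviour from the command line, assuming a large test file on the server (host and credentials below are placeholders): curl sends REST <offset> before RETR when resuming, so the verbose output shows whether the server honours the offset.

      # start the download, interrupt it part-way with Ctrl-C
      curl -o big.bin ftp://user:pass@ftp.example.com/big.bin
      # resume: -C - makes curl send REST <size-of-local-file> before RETR
      curl -v -C - -o big.bin ftp://user:pass@ftp.example.com/big.bin
      # in the -v output, check the reply to REST and whether the transfer
      # length equals the remaining bytes or the full file size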

  • Is it dangerous to use both Sky Drive and Dropbox?

    - by Matthew
    I'd like to experiment with Sky Drive, but keep using my Dropbox account unless I decide to switch. This answer gives instructions for how to set up both at the same time, but I'm a little worried about data integrity. Is there any danger involved here? Will Sky Drive and Dropbox fight each other? Note that I am using Sky Drive/Dropbox on multiple computers, so they will be writing data as well as reading it. Is this safe? Edit: I can use them with different folders if necessary, but I'm particularly curious what would happen if they sync from the same folder.

  • Fast, reliable data transfers from/to China

    - by Nils
    We are a small company and we will need to transfer rather large amounts of data (10GB+ each time) between Europe and China in the near future. As many may have experienced, Internet connections to or from China can be rather unreliable and slow at times without any apparent reason. For example, while sending data to China via FTP generally works well, it can be painfully slow in the other direction. Currently, we are investigating new ways to get high transfer rates in both directions. So far we have tried:
    - FTP (see above)
    - FTP over VPN services (generally slower than direct connections)
    - F2F (like Retroshare or Freenet - slow!!)
    - Aspera (fast but expensive!)
    - BitTorrent (unreachable end nodes, because of firewalls which we must not configure)
    We would like to try:
    - Cloud storage (e.g. Amazon S3, Google Storage) - are those services always and reliably reachable from inside China?
    - Point-to-point VPN (currently not possible, because of the network, see above)
    I'd be especially grateful to hear from people who have already dealt with this kind of problem before.
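
    For the cloud-storage option, a rough reachability and throughput probe from a host in China could look like the sketch below, assuming the AWS CLI is configured; the bucket names and region list are hypothetical, and the buckets must exist beforehand.

      # generate a fixed-size test file
      dd if=/dev/urandom of=probe.bin bs=1M count=100
      # time uploads to a few candidate regions
      for region in ap-northeast-1 ap-southeast-1 eu-west-1; do
          time aws s3 cp probe.bin "s3://transfer-probe-$region/probe.bin" --region "$region"
      done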

  • ext4: error loading journal

    - by cloudyOutside
    I have an external hard drive with two partitions: a small FAT32 partition which is mostly empty and works fine, and a large ext4 partition with tons of data, most of which isn't backed up. The ext4 partition is visible, but can't be mounted. I get an "error loading journal" error. The drive is a Western Digital Caviar Blue 500GB. Roughly 30GB of that is FAT32 and the rest is the ext4. The light on the enclosure (made by Cavalry) turns red when reading from the bad partition. There wasn't any warning, but coincidentally, I've been thinking lately that I should get two large-capacity drives for real backups. Is there anything that can be done? I'm not even sure I have enough storage to back everything up even if it is recoverable.
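
    For what it's worth, a common recovery sequence for an unreadable ext4 journal looks like the sketch below; the device name /dev/sdb2 and the spare path are hypothetical. Working on a dd image keeps the attempts non-destructive.

      # image the partition first so recovery attempts can't make things worse
      sudo dd if=/dev/sdb2 of=/mnt/spare/data.img bs=4M conv=noerror,sync
      # let e2fsck try to replay or repair the journal on the copy
      sudo e2fsck -v /mnt/spare/data.img
      # if the journal itself is unreadable, drop it, recheck, then recreate it;
      # any transactions still sitting in the journal are lost
      sudo tune2fs -O ^has_journal /mnt/spare/data.img
      sudo e2fsck -fv /mnt/spare/data.img
      sudo tune2fs -j /mnt/spare/data.img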

  • Shared volume for data (multiple MDF) and another shared volume for logs (multiple LDF)

    - by hagensoft
    I have 3 instances of SQL Server 2008, each on a different machine, with multiple databases on each instance. I have 2 separate LUNs on my SAN for MDF and LDF files. The NDX and TempDB files run on the local drive on each machine. Is it OK for the 3 instances to share the same volume for the data files and another volume for the log files? I don't have thin provisioning on the SAN, so I would like to avoid constraining disk space by creating multiple volumes; however, I was advised that I should create a volume (drive letter) for each instance, if not for each database. I am aware that I should at least split my logs and data files. No instance would share the actual database files, just the space on the drive. Any help is appreciated.

  • Making a hidden truecrypt volume with existing data

    - by Bill Grey
    I have a 1TB HDD which I would like to encrypt. I would like to make a hidden volume: an outer volume containing almost nothing but some decoy data, with the rest of the space in a hidden volume. However, my drive is over 95% full. Is it still possible to do this, or would it have to be done on an empty drive and the data copied over afterwards? I could not find the answer to this question in the documentation. Also, how easy would it be to undo, i.e. decrypt the drive? Would that again need another empty drive to begin with?

  • SFTP: How to keep data out of the DMZ

    - by ChronoFish
    We are investigating solutions to the following problem: we have external (Internet) users who need access to sensitive information. We could offer it to them via SFTP, which would offer a secure transport method. However, we don't want to maintain the data on the server, as it would then reside in the DMZ. Is there an SFTP server that has "copy on access", such that if the box in the DMZ were compromised, no actual data would reside on that box? I am envisioning an SFTP proxy or SFTP passthrough. Does such a product exist currently?
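
    One way to approximate this with stock tools is sketched below, assuming an OpenSSH-based DMZ host: the box stores nothing locally and mounts the data read-only from an internal host on demand, so a compromise exposes credentials but no data at rest. Hostnames and paths are hypothetical.

      # on the DMZ host: mount the internal share over SSH, read-only
      sshfs backend.internal:/srv/sensitive /srv/sftp-root -o ro,reconnect,allow_other
      # then expose /srv/sftp-root through the chrooted OpenSSH SFTP subsystem,
      # via sshd_config:
      #   Match Group sftponly
      #       ChrootDirectory /srv/sftp-root
      #       ForceCommand internal-sftp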

  • Transparently decompressing data in archive to allow greater compression later

    - by Vi.
    I have, for example, a filesystem image which contains some compressed files (with weak compression such as gzip), for example manpages, or archives whose uncompressed content also exists nearby. How do I pre-filter the data to "expand" the compressed data to plain form (to re-compress it with strong compression) and then post-filter after decompression to restore the original "semi-compressed" image? An SHA-1 match is desirable but not strictly required (the resulting image must work, though; e.g. re-compressed files should not grow too much, should be decompressible, etc.). It's like improving the compression ratio by reversing weak compression algorithms. Are there programs for this?
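
    A naive illustration of the idea for a tree containing .gz files, sketched below: plain gzip cannot guarantee byte-identical re-compression (its output depends on level and flags), which is exactly the hard part a real pre-/post-filter would have to solve.

      # pre-filter: record, then expand, the weakly-compressed members
      find tree/ -name '*.gz' > gz-list.txt
      xargs -a gz-list.txt gunzip
      tar c tree/ gz-list.txt | xz -9e > tree.tar.xz
      # post-filter: restore the tree and re-gzip the recorded members
      tar xf tree.tar.xz
      sed 's/\.gz$//' gz-list.txt | xargs gzip -9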

  • How to warehouse data that is not needed from SQL Server

    - by I__
    I have been asked to truncate a large table in SQL Server 2008. The data is not needed but might be needed once every two years. It will NEVER have to be changed, only viewed. The question is: since I don't need the data on a day-to-day basis, what do I do with it to protect it and back it up? Please keep in mind that I will need it accessible maybe once every two years, and it is FINE for us if the recovery process takes a few hours. The entire table is about 3 million rows and I need to truncate it to about 1 million rows.
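
    One low-tech approach, sketched with hypothetical server and table names: export the rows that are about to be removed to a compressed archive file with bcp, store that with the backups, and reload it into a staging table on the rare occasions the data is needed (a multi-hour restore is acceptable here).

      # archive the rows before truncating (names are placeholders)
      bcp "SELECT * FROM MyDb.dbo.BigTable WHERE CreatedAt < '2008-01-01'" queryout bigtable_archive.bcp -n -S MYSERVER -T
      gzip bigtable_archive.bcp
      # years later: gunzip, then reload into a staging table with
      #   bcp MyDb.dbo.BigTable_Archive in bigtable_archive.bcp -n -S MYSERVER -T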

  • Do any filesystems support multiple forks / streams on directories?

    - by hippietrail
    Apple's HFS+ supports multiple forks such as the old data and resource forks. NTFS supports alternate data streams. I believe some *nix filesystems also have some support for multiple file forks or streams. Given that directories (folders) are just a kind of file at the filesystem level, I'm wondering if any of the filesystems which support this feature support it for dirs as well as files? (Or indeed directories in the alternate forks / streams?) I'm mostly asking out of curiosity rather than wanting to use such a feature. But one use it would have would be additional metadata for directories, which seems to be the most common use for these streams for files currently.

  • Combine OS partition with data partition on NAS4Free/FreeNAS

    - by Pak
    I recently built a NAS4Free (formerly FreeNAS) machine using a 256MB (yes, MB) USB drive for the OS. When I did the original install, I had the bright idea of making the OS partition just big enough for the OS and then creating a second partition using the remainder of the drive to store stuff pertaining to the OS. I never really found a use for the data partition, and I ended up running out of space on the OS partition, so now I'd like to combine the partitions into a single partition. Is this something that is possible to do while everything is up and running? If it comes down to it, I can take down the machine and do a fresh install of the OS using the entire space of the USB drive, but I'd like to use this as an opportunity to better familiarize myself with FreeBSD/UNIX-type systems. If this is possible, will it interfere with the NAS4Free things? The data partition shows up in the web interface under the disks section. If I end up manually changing the partitions, I'd be concerned about NAS4Free getting confused by the missing partition.
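
    For reference, on FreeBSD-based systems the partition surgery itself would look roughly like the sketch below, assuming the stick is da0 with the OS as partition 1 and the data partition as partition 2 (device and partition names vary with the install). It can't safely be done on a mounted OS partition, so boot from other media first and back up the config.

      gpart show da0            # confirm the layout before touching anything
      gpart delete -i 2 da0     # remove the unused data partition
      gpart resize -i 1 da0     # grow the OS partition into the freed space
      growfs /dev/da0s1a        # grow the UFS filesystem to match (name varies)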

  • Why is Chrome receiving data?

    - by Aero
    Chrome seems to be continually receiving data even though I'm not downloading anything, and it's making a noticeable impact on my browsing speed. The first screenshot shows Chrome receiving data even though I'm not downloading anything (nor buffering a YouTube video, etc.). Even after I completely close Google Chrome, "chrome.exe" remains in the Resource Monitor list and the "Received bytes" column continually increases in the screenshot below. However, "chrome.exe" does not show up in the Processes tab of Task Manager. This only occurs sometimes, but I don't know why. I have tried running malware/virus scans to ensure that there is nothing malicious behind this, but those scans have shown nothing. Any ideas on what's causing this?

  • Is there a program that will show a tree of the differences in two file trees?

    - by Huckle
    In Windows, I manually back up from time to time by formatting my external drive and copying the contents of my data partition over. Inevitably there is a difference in the number and size of the files copied, because of system files, etc. Is there a program that would diff two directories recursively and compile the differences into a nice GUI tree that I could peruse (and preferably filter) to ensure that everything I want made it over to the drive? It should only show files that are not in both directories. (Also, please ignore the inadequacy of my backup solution.)
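
    Not a GUI tree, but for comparison, the same one-sided check can be done from a Unix-ish shell such as Cygwin (paths below are hypothetical):

      # list files present on one side only
      diff -rq /cygdrive/d/data /cygdrive/e/backup | grep '^Only in'
      # or itemize differences without copying anything (-n = dry run)
      rsync -rin /cygdrive/d/data/ /cygdrive/e/backup/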

  • Scrape data from a website and post it on the blog (WordPress)

    - by Pennf0lio
    This could be in Doctype, but I'm looking for software, or just a plugin for WordPress. I want to fetch data from a website and automatically post it on my blog (WordPress powered). The site doesn't have an RSS feed or an API to get the data, so I currently have to copy and paste it one by one and post it on WordPress. Do you know an alternative to my process, or software or a plugin that does the job? Thanks!
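
    As a rough sketch of what an automated version could look like with stock tools - the source URL, the sed pattern, and the wp-cli setup are all assumptions:

      # pull the page and crudely extract the fragment of interest
      content=$(curl -s 'http://source.example.com/page' | sed -n 's:.*<div id="data">\(.*\)</div>.*:\1:p')
      # create the post with wp-cli on the WordPress host
      wp post create --post_title="Scraped data $(date +%F)" --post_content="$content" --post_status=publish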

  • Backing up data (including mysqldumps) to S3

    - by seengee
    We have a web app on a number of servers, and we want to add an additional layer of redundancy by backing up the key data to S3. The key data is the MySQL database and a folder containing dynamically created site assets - predominantly images. Some kind of rsync-based solution would initially seem the best plan. A couple of years ago we played with s3cmd (in particular s3cmd sync) with some success, but we didn't find it particularly reliable, although this may have changed since. It's occurred to me, though, that an rsync solution might not work particularly well with a single db.sql file created with mysqldump, as I assume this means the whole database gets transferred each time; with multiple databases of over 1GB this is going to add up to a lot of traffic (and $s) very quickly. With the image files I could simply transfer files modified within the last day, which would be far simpler. What approach should I look at?
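
    A sketch of one common pattern, assuming the AWS CLI (or s3cmd) is configured and the bucket name is hypothetical: dump each database separately so unchanged databases produce identical files that sync can skip, and let sync move the assets incrementally.

      # per-database dumps; --skip-dump-date plus gzip -n keeps an unchanged
      # database's dump byte-identical between runs
      for db in appdb blogdb; do
          mysqldump --single-transaction --skip-dump-date "$db" | gzip -n > "/backups/$db.sql.gz"
      done
      # only changed files are transferred
      aws s3 sync /backups/ s3://my-backup-bucket/mysql/
      aws s3 sync /var/www/assets/ s3://my-backup-bucket/assets/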

  • Windows XP - non-user input data filter message after installing wireless keyboard & mouse

    - by James
    After I installed an MS wireless keyboard and mouse and the associated software, I started getting an annoying message titled "Hardware installation" telling me the software I am trying to install did not pass the XP logo tests. The software is for an "HID non-user input data filter", and I have two options: continue anyway or stop the installation. If I continue, the installation fails; if I stop it, another message pops up with a little mouse logo and the whole process repeats itself. After I am done with that message, a third dialog appears. This happens every time I boot up my PC (a desktop). I tried following advice I found in some forum and downloaded the Windows update for the HID non-user input data filter, but that installation failed as well. The thing is, both the keyboard and mouse are working fine. Is there any way to get past these dialogs?

  • gmetric data submitted doesn't follow dmax value

    - by 580farm
    I have a custom script that queries a metric port for an application I'm running and submits the parsed values to ganglia via gmetric. The script runs every minute, so I submit the data to ganglia using the following gmetric options:

      /usr/sbin/gmetric -g ec2 -s positive -t uint32 -d 600 -n "$NAME" -v $VALUE -x 60

    But for some reason there are still gaps in the graphing data. Is there something in my formatting that is preventing the dmax/ttl of the last metric received from being honored? Has anyone who does custom metric collection run into this problem before and can shed some insight or provide some tips on how best to correct it?
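
    One hedged way to see what gmond actually recorded for the metric is to dump its XML state and inspect the TN/TMAX/DMAX attributes; the host and metric name below are placeholders. TN climbing past TMAX (60 here, from -x) between submissions would explain the gaps.

      # gmond serves its current state as XML on TCP port 8649
      nc gmond-host 8649 | xmllint --xpath '//METRIC[@NAME="my_metric"]' -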

  • Need to pull data from a website every 5 seconds using VBA

    - by Milton
    I need to pull data from www.dsebd.org every 5 seconds. This VBA code pulls the data but does not run automatically. Please help me.

      Sub ButtonCode()
          ' execute the scraping macro
          Call GetCotton
          ' schedule this macro to run again in 5 seconds
          Application.OnTime Now + TimeValue("00:00:05"), "ButtonCode"
      End Sub

      Sub GetCotton()
          Dim xml As Object
          Dim html As Object
          Dim elemcollection As Object
          Dim result As String
          Dim t As Long, r As Long, c As Long, ActRw As Long

          ' fetch the page synchronously
          Set xml = CreateObject("MSXML2.XMLHTTP.6.0")
          With xml
              .Open "GET", "http://www.dsebd.org/dseX_share.php", False
              .send
          End With
          result = xml.responseText

          ' parse the HTML and copy every table into Sheet1
          Set html = CreateObject("htmlfile")
          html.body.innerHTML = result
          Set elemcollection = html.getElementsByTagName("table")
          For t = 0 To elemcollection.Length - 1
              For r = 0 To elemcollection(t).Rows.Length - 1
                  For c = 0 To elemcollection(t).Rows(r).Cells.Length - 1
                      ThisWorkbook.Sheets("Sheet1").Cells(ActRw + r + 1, c + 1) = elemcollection(t).Rows(r).Cells(c).innerText
                  Next c
              Next r
              ActRw = ActRw + elemcollection(t).Rows.Length + 1
          Next t
      End Sub
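
    A likely reason it never runs on its own, assuming the two subs live in a standard module: nothing starts the OnTime loop. A minimal starter in the ThisWorkbook module (an assumption, not part of the original code) would be:

      Private Sub Workbook_Open()
          ' kick off the self-rescheduling loop once when the file opens
          ButtonCode
      End Sub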

  • Send request body data when running siege

    - by qui
    I am trying to use the command-line utility Siege to load test a service. The service receives JSON in the request body via a POST. I have a file called example-data.json with the JSON inside. (I will eventually turn this into a tiny service which creates random JSON for testing, but this should do for now.) I have another file called hit-qa.siege containing:

      http://www.qa-url.com POST < example-data.json

    and I run:

      siege -c10 -d1 -r1 -f ops/perf/hammer-dev.siege

    When I check the logs of the service, it is not receiving anything in the request body. My googling has been fruitless; does anyone know how to accomplish this?
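
    One hedged guess worth checking: siege sends POST bodies as application/x-www-form-urlencoded by default, and many JSON services ignore a body without the right Content-Type. Recent siege versions let you override the header:

      # -T / --content-type sets the Content-Type used for POST bodies
      siege -c10 -d1 -r1 -T 'application/json' -f ops/perf/hammer-dev.siege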

  • Recover deleted files on windows 2008 file server

    - by aniga
    We have recently been hit by a weird virus which marked all files and folders as system files/folders and hid them all, apart from some weird ones it created, including: ..exe porn.exe secret.exe password.exe etc. We have managed to restore the files with the attrib command, unhiding them and unmarking them as system files; however, we have noticed that we are missing some 4 to 5 folders, of which (based on my luck) 2 are the two most important clients we have. I am not sure if these files were deleted by the worm/virus or by my colleagues, who are not owning up to it, but the files are now gone. Worst of all, we do not have any backup whatsoever. (Yes, I know we should not have let that happen; it is a lesson learned, and since last night we have two forms of backup, one to an external device and one in the cloud, but I doubt any of that will help us now.) We have 1 Windows 2008 file server and 4 client computers running Windows 7. I would be grateful if anyone can help us recover from this disaster, which could potentially put us out of business.

  • Outlook 2010 + Move IMAP PST file = Outlook data file cannot be accessed

    - by GWB
    I set up a new IMAP account in Outlook 2010. It works, but it creates the IMAP PST file in C:\Users\User\AppData\Local\Microsoft\Outlook. I want the file on my data drive in D:\Users\User\Documents\Outlook Files (the same folder where Outlook automatically creates the local Outlook PST). I followed the instructions here to move the IMAP PST. Testing the account (send/receive) works fine, but if I try to manually send an email I get error 0x8004010F, "Outlook data file cannot be accessed". I've tried repairing the PST using SCANPST (it always finds errors) and deleting and recreating the account, but I get the same error. If I move the PST file back, it works again, but this is not ideal. Note: I don't think this is a duplicate of this question, as the cause is different and the solution does not help.

  • Pass User Data to AWS client

    - by bearrito
    Has anyone successfully passed user data to the AWS CLI? I have tried various incantations of the following, but it does not work. The docs say the string must be base64 encoded: http://docs.aws.amazon.com/cli/latest/reference/ec2/run-instances.html The instance logs never indicate that the script is executed and chef is installed.

      aws ec2 run-instances --image-id ami-a73264ce --count 1 --instance-type t1.micro --key-name scrubbed --iam-instance-profile Arn=arn:aws:iam::scrubbed:instance-profile/scrubbed --user-data $(base64 chef_user_data.sh --wrap=0)

    chef_user_data.sh:

      #!/bin/bash
      curl -L https://www.opscode.com/chef/install.sh | sudo bash
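
    Two hedged things to try: quote the command substitution so the shell cannot split the base64 text into multiple arguments, or let a newer AWS CLI read and encode the file itself via file://.

      # quote the substitution so the base64 blob stays a single argument
      aws ec2 run-instances ... --user-data "$(base64 --wrap=0 chef_user_data.sh)"
      # or let the CLI handle the encoding (newer awscli releases)
      aws ec2 run-instances ... --user-data file://chef_user_data.sh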

  • How can I pinpoint a USB file transfer bottleneck in Unix?

    - by HankHendrix
    I'm experiencing very slow data transfer speeds over USB 2.0 on my *nix box and was wondering how I can pinpoint the cause of the problem. I've looked into iotop and top, but the CPU and memory figures look normal (compared to guides I have checked). The affected box runs Ubuntu 12.04 32-bit Server on an Asus EEE 701 2G, and I am transferring from the OS over USB 2.0 to an external HDD (which transfers at 30MB/s+ under Windows 7 on another machine). I get rsync write speeds of 1MB/s from the OS to the USB HDD, which seems ridiculously slow. These speeds are consistent across other USB HDDs and sticks.
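
    A few hedged checks that separate raw device throughput from filesystem and rsync overhead (device and mount names below are hypothetical):

      sudo hdparm -t /dev/sdb                   # raw sequential read from the USB disk
      sudo dd if=/dev/zero of=/mnt/usb/test.bin bs=1M count=128 oflag=direct   # write path
      lsusb -t                                  # did the device negotiate 480M or fall back to 12M?
      dmesg | grep -i usb | tail                # look for resets or full-speed fallback messages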
