Search Results

Search found 58486 results on 2340 pages for 'data integrator'.


  • Converting an Outlook Express CSV address book and dbx files into Thunderbird on Windows 7

    - by PiotrK
    Recently I changed my OS from XP to Windows 7. I made a backup of all my Outlook Express messages (the dbx files, plus the address book as CSV). On Windows 7 I want to import that data into Thunderbird. There is an option for importing from Outlook Express, but it looks for a live application's data (I can't point it at a directory of files myself), and Outlook Express isn't installed on Windows 7, so I can't just import the backup into it and then into Thunderbird. How can I import that data into Thunderbird?

    Read the article

  • Printer monitoring script (PowerShell)

    - by HannesFostie
    I am going to write a script of some sort that checks the event log on a Windows Server 2003 machine for all print jobs and writes them to a comma-delimited text file like printername_floor_room.txt. I am wondering what the best way is to do this in (near) real time, checking the event log constantly. Any caveats I need to be aware of? Thanks. EDIT: Okay, I will most likely go with PowerShell, use Get-EventLog, and then massage the "table" data. Problems I'm having: if I save all this data to a plain text file, how do I get the data back out of it? A comma-separated file I could work with, but free-form text I'm not sure about. And once that is sorted out, I'm still not sure how to keep the file updated more or less in real time. Can I make this service-like, without hogging all the resources? Run it every x seconds, for example?
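
    A starting point might look like this (a sketch, untested: it assumes completed print jobs land in the System log with source "Print" and event ID 10, as they do on Windows Server 2003, and that PowerShell 2.0 is available):

        # poll the System log for completed print jobs (source "Print", event ID 10)
        # and append them to a comma-separated file
        $logFile = 'C:\logs\printjobs.csv'
        $last    = Get-Date

        while ($true) {
            Get-EventLog -LogName System -Source Print -After $last -ErrorAction SilentlyContinue |
                Where-Object { $_.EventID -eq 10 } |
                ForEach-Object {
                    # quote the message so embedded commas don't break the CSV
                    $msg = $_.Message -replace '"', '""'
                    Add-Content $logFile ('{0},"{1}"' -f $_.TimeGenerated, $msg)
                }
            $last = Get-Date
            Start-Sleep -Seconds 30
        }

    There is a small race window between the query and resetting $last; remembering the TimeGenerated of the newest event seen and querying -After that value is tighter. To run it service-like without a console open, the loop can be wrapped as a service with something like srvany.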

    Read the article

  • Move MySQL database while instance is online

    - by Mike Scott
    I have a MySQL instance containing a number of databases, one of which is an archive database (although using the INNODB rather than ARCHIVE storage engine) that is not queried or written to in normal operation. The data filesystem is filling up and I'd like to move the archive database's data directory to a different filesystem (and then symlink it back, obviously). If there are no SQL statements attempting to query or update the data during the move, can I safely do this while the MySQL instance and the other databases stay online and in use? I plan to rsync the database directory to the new filesystem, then rename the old one on the original filesystem to something different and create the new symlink. lsof reports that MySQL does have the .ibd files open, so presumably it would have to reopen them.
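
    For reference, the move as described might look like this (a sketch with made-up paths; it assumes nothing touches the archive DB during the copy):

        # copy while live, then re-run to pick up any stray delta
        rsync -a /var/lib/mysql/archive_db/ /mnt/newdisk/archive_db/
        rsync -a /var/lib/mysql/archive_db/ /mnt/newdisk/archive_db/

        # swap the directory for a symlink
        mv /var/lib/mysql/archive_db /var/lib/mysql/archive_db.old
        ln -s /mnt/newdisk/archive_db /var/lib/mysql/archive_db

    One caveat: since lsof shows mysqld holding the .ibd files open, it will keep using the old inodes after the rename, and unlike MyISAM, InnoDB does not reliably close .ibd files on FLUSH TABLES. A brief mysqld restart after the swap is the safe way to make it reopen the files at the new location.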

    Read the article

  • What will happen to my RAID5 after a motherboard change?

    - by abatishchev
    Currently I have an ASUS P5Q-EM with 3 HDDs in RAID5 using its on-board RAID controller, an Intel ICH10R. I want to buy a new motherboard, for example a Gigabyte GA-EQ45M-S2, which also has an on-board RAID controller, but an Intel ICH10DO. What will happen to my data on the RAID5? Will I have to re-create the array from scratch and lose all my data? Is such an array soft RAID or "soft-hard" (fake RAID)? And what if my current motherboard breaks: what happens to my data then?

    Read the article

  • Amazon EBS root volume persistence

    - by hipplar
    When I launch a new Windows EC2 instance I am given a 30 GB root EBS volume. I want to make sure I understand the EBS terminology correctly. The docs say: Q: What happens to my data when a system terminates? The data stored on a local instance store will persist only as long as that instance is alive. However, data that is stored on an Amazon EBS volume will persist independently of the life of the instance. What exactly does "instance is alive" mean? If I write files to the root volume and reboot the instance, will the files remain? Or do I physically have to terminate (delete) the instance for the root volume to go away? Thanks

    Read the article

  • Getting live traffic/visitor analytics when using a reverse proxy

    - by jotto
    I'm in the process of implementing Varnish as a reverse proxy for a Ruby on Rails app. I'm using Google Analytics (a JS/client-side script that records visitor data), but it's several hours delayed, so it's useless for knowing what's going on now. I need at-a-glance live data that includes referring traffic and the current req/sec. Right now I am using a simple Rack middleware application for live stats (gist.github.com/235745), but if the majority of traffic is served by Varnish, Rack will never be hit, so this won't work. The closest solution I've found so far is http://www.reinvigorate.net/ but it's in beta (there are also no implementation details on their front page). Does Varnish have traffic logs that I can custom-format to match my Apache logs so I can combine them, or will I have to roll my own JS implementation like GA that shows the data in real time?
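
    On the log question: Varnish ships with varnishncsa, which reads the shared-memory log and emits one line per request in Apache/NCSA combined format, so its output can be merged with existing Apache logs (a sketch; flag details vary between Varnish versions):

        # write Varnish requests in Apache combined log format, appending to a file
        varnishncsa -a -w /var/log/varnish/access.log

        # or watch requests live on stdout
        varnishncsa

    That covers referrers; for a live req/sec figure, varnishstat's client_req counter is the usual source.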

    Read the article

  • Rugged netbook and software options for running R statistical software?

    - by Thomas
    I am interested in purchasing a netbook to do field research in another country.

    My hardware requirements for the netbook:
    - rugged enough to survive a bit of wear and tear
    - fairly fast processing (the ability to upgrade from 1 GB of RAM to 2 GB)
    - battery life longer than 6 hours
    - at least a 10-inch screen
    - a decent camera for Skyping

    My software needs:
    - a spreadsheet program for basic data input (like Excel or OpenOffice)
    - the open-source statistical program R for basic data analysis (regression, etc.)
    - word processing (Word or OpenOffice)

    Do you have any suggestions on which models or brands might fit my needs? Some of the models I was considering: Samsung NB-30, Toshiba NB305, Asus Eee PC 1005HA, Lenovo S10-2. Also, any recommendations on the best software to load on the aforementioned netbooks? I saw a resource on Lifehacker about this. Any comments? Thanks for your help! -Thomas

    Read the article

  • Reverse lookup SERVFAIL

    - by Quan Tran
    I just set up a DNS server and a web server using VirtualBox. The IP address of the DNS server is 192.168.56.101 and the web server is 192.168.56.102. Here are my configuration files for the DNS server.

    named.conf:

        //
        // named.conf
        //
        // Provided by Red Hat bind package to configure the ISC BIND named(8) DNS
        // server as a caching only nameserver (as a localhost DNS resolver only).
        //
        // See /usr/share/doc/bind*/sample/ for example named configuration files.
        //
        options {
            directory "/var/named";
            dump-file "/var/named/data/cache_dump.db";
            statistics-file "/var/named/data/named_stats.txt";
            memstatistics-file "/var/named/data/named_mem_stats.txt";
            //query-source address * port 53;
            //forward first;
            forwarders { 8.8.8.8; 8.8.4.4; };
            listen-on port 53 { 127.0.0.1; 192.168.56.0/24; };
            allow-query { localhost; 192.168.56.0/24; };
            recursion yes;
            dnssec-enable yes;
            dnssec-validation yes;
            dnssec-lookaside auto;
            /* Path to ISC DLV key */
            bindkeys-file "/etc/named.iscdlv.key";
            managed-keys-directory "/var/named/dynamic";
        };

        logging {
            channel default_debug {
                file "data/named.run";
                severity debug 10;
                print-category yes;
                print-time yes;
                print-severity yes;
            };
        };

        zone "quantran.com" in {
            type master;
            file "named.quantran.com";
        };

        zone "56.168.192.in-addr.arpa" in {
            type master;
            file "named.192.168.56";
            allow-update { none; };
        };

        include "/etc/named.rfc1912.zones";
        include "/etc/named.root.key";

    named.quantran.com:

        $TTL 86400
        quantran.com.   IN SOA  dns1.quantran.com. root.quantran.com. (
                                100      ; serial
                                3600     ; refresh
                                600      ; retry
                                604800   ; expire
                                86400 )
                        IN NS   dns1.quantran.com.
        dns1.quantran.com.      IN A    192.168.56.101
        www.quantran.com.       IN A    192.168.56.102

    named.192.168.56:

        $TTL 86400
        $ORIGIN 56.168.192.in-addr.arpa.
        @       IN SOA  dns1.quantran.com. root.quantran.com. (
                                100      ; serial
                                3600     ; refresh
                                600      ; retry
                                604800   ; expire
                                86400 )  ; minimum
                IN NS   dns1.quantran.com.
        101.56.168.192.in-addr.arpa.    IN PTR  dns1.quantran.com.
        102                             IN PTR  www.quantran.com.

    When I try a normal lookup from the host (the host is configured so that its only nameserver is the DNS server, 192.168.56.101):

        quan@quantran:~$ host www.quantran.com
        www.quantran.com has address 192.168.56.102
        quan@quantran:~$ host dns1.quantran.com
        dns1.quantran.com has address 192.168.56.101

    But when I try a reverse lookup:

        quan@quantran:~$ host -v 192.168.56.101 192.168.56.101
        Trying "101.56.168.192.in-addr.arpa"
        Using domain server:
        Name: 192.168.56.101
        Address: 192.168.56.101#53
        Aliases:

        Host 101.56.168.192.in-addr.arpa not found: 2(SERVFAIL)
        Received 45 bytes from 192.168.56.101#53 in 0 ms

        quan@quantran:~$ host -v 192.168.56.102 192.168.56.101
        Trying "102.56.168.192.in-addr.arpa"
        Using domain server:
        Name: 192.168.56.101
        Address: 192.168.56.101#53
        Aliases:

        Host 102.56.168.192.in-addr.arpa not found: 2(SERVFAIL)
        Received 45 bytes from 192.168.56.101#53 in 0 ms

    So why can't I perform a reverse lookup? Anything wrong with the zone configuration files? Thanks in advance :)

    Oh, here is the output in the log file /var/named/data/named.run when I perform the reverse lookup (I also made some changes to the logging section in named.conf):

        quan@quantran:~$ host 192.168.56.102 192.168.56.101
        Using domain server:
        Name: 192.168.56.101
        Address: 192.168.56.101#53
        Aliases:

        Host 102.56.168.192.in-addr.arpa not found: 2(SERVFAIL)

        /var/named/data/named.run:
        02-Jun-2014 15:18:11.950 client: debug 3: client 192.168.56.1#51786: UDP request
        02-Jun-2014 15:18:11.950 client: debug 5: client 192.168.56.1#51786: using view '_default'
        02-Jun-2014 15:18:11.950 security: debug 3: client 192.168.56.1#51786: request is not signed
        02-Jun-2014 15:18:11.950 security: debug 3: client 192.168.56.1#51786: recursion available
        02-Jun-2014 15:18:11.950 client: debug 3: client 192.168.56.1#51786: query
        02-Jun-2014 15:18:11.950 client: debug 10: client 192.168.56.1#51786: ns_client_attach: ref = 1
        02-Jun-2014 15:18:11.950 query-errors: debug 1: client 192.168.56.1#51786: query failed (SERVFAIL) for 102.56.168.192.in-addr.arpa/IN/PTR at query.c:5428
        02-Jun-2014 15:18:11.950 client: debug 3: client 192.168.56.1#51786: error
        02-Jun-2014 15:18:11.950 client: debug 3: client 192.168.56.1#51786: send
        02-Jun-2014 15:18:11.950 client: debug 3: client 192.168.56.1#51786: sendto
        02-Jun-2014 15:18:11.951 client: debug 3: client 192.168.56.1#51786: senddone
        02-Jun-2014 15:18:11.951 client: debug 3: client 192.168.56.1#51786: next
        02-Jun-2014 15:18:11.951 client: debug 10: client 192.168.56.1#51786: ns_client_detach: ref = 0
        02-Jun-2014 15:18:11.951 client: debug 3: client 192.168.56.1#51786: endrequest
        02-Jun-2014 15:18:11.951 client: debug 3: client @0xb537e008: udprecv
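
    Before digging further, BIND's own checkers will confirm whether the config and zone files parse as intended (assuming the paths above):

        named-checkconf /etc/named.conf
        named-checkzone quantran.com /var/named/named.quantran.com
        named-checkzone 56.168.192.in-addr.arpa /var/named/named.192.168.56

    named-checkzone in particular reports the exact line of a zone-file problem that named itself only surfaces as SERVFAIL.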

    Read the article

  • Web browsing over SSH

    - by Alex Marshall
    Hello. I have something of a difficult situation: our company has a webserver in a remote data center that is, at the moment, only accessible by SSH, and the firewall is not easily modifiable because the techs at the data center have been unreliable and unreachable lately (not my choice of data center, and switching is not an option at the moment). Are there any browsers or plugins out there that will let me browse over an SSH connection? I can browse with links and lynx on the SSH command line, but that doesn't give me the functionality I need, and it's too hard to navigate the web application (running on a Tomcat server on the box) that I need access to. Does anybody have any suggestions? We're already working on getting direct access to the web application by having the firewall opened up, but I need something better in the meantime.
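
    A standard way around this (a sketch, assuming the SSH account can reach the web app's port) is SSH port forwarding: either a SOCKS proxy for general browsing, or a plain local forward straight to Tomcat:

        # dynamic forward: browser-wide SOCKS5 proxy on localhost:1080
        ssh -D 1080 -N user@datacenter-host

        # or a local forward straight to the Tomcat port (assumed 8080 here)
        ssh -L 8080:localhost:8080 -N user@datacenter-host

    With the first form, point the browser's SOCKS proxy at localhost:1080 (in Firefox: manual proxy configuration); with the second, just browse to http://localhost:8080/.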

    Read the article

  • MSSQL auditing recommendations

    - by Josh Anderson
    As an aspiring DBA, I have recently been assigned the task of implementing tracking of all data changes in the database for a piece of software we are developing. After playing with Microsoft's change data capture features, I'm looking into other solutions. We plan to distribute our product as a hosted solution, and unlimited installations would be desired for maximum scalability. I've looked at IBM's Guardium as well as DB Audit by SoftTree. I'm curious whether anyone has solutions they have used in the past, or any suggestions or methods to achieve complete, and of course cost-effective, auditing of data changes.
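
    For context, the built-in change data capture the poster mentions boils down to this (a sketch, assuming SQL Server 2008 Enterprise and a hypothetical dbo.Orders table in a hypothetical OurProduct database):

        -- enable CDC at the database level, then per table
        USE OurProduct;
        EXEC sys.sp_cdc_enable_db;

        EXEC sys.sp_cdc_enable_table
            @source_schema = N'dbo',
            @source_name   = N'Orders',  -- hypothetical table
            @role_name     = NULL;       -- no gating role

        -- changes are then queryable from cdc.dbo_Orders_CT

    The Enterprise-only licensing of CDC is often what pushes hosted products toward trigger-based auditing or third-party tools like the ones mentioned.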

    Read the article

  • After upgrading to Outlook 2013, contacts show the default picture in the reading pane

    - by Juan Zamudio
    I'm using Windows 8 on a domain (a domain account, not a Microsoft account), connecting to Exchange. All my contacts were Outlook contacts with pictures and other data, and I could see a contact's picture in the reading pane and the people pane in Outlook 2010. Since the upgrade to Outlook 2013, I can still see my Outlook contacts, but their details appear unavailable in the reading pane: all I see is the default picture (which also shows up in notifications). If I hover over a person's name in the reading pane, all I get is the card with default info (the only data being that the contact is available for the next 8 hours); if I compose a new message and hover over the same contact's name, I see the card with all the data (picture, company, etc.). Is this the default behavior in Outlook 2013 when you are on a domain and not connected to any service, or is there a way to show my contacts' pictures in every part of Outlook?

    Read the article

  • PHP server settings that restrict large POST requests

    - by Korjavin Ivan
    I have a PHP script on shared hosting that receives big data payloads via AJAX POST. Just now, after some maintenance work on the hosting side, I see that the script is broken. I checked with curl. With file temp1 (237 chars total):

        user_avatar=&user_baner=&user_sig=....

        curl -H "X-Requested-With: XMLHttpRequest" -X POST --data @temp1 'http://host/mypage.php'

    it works perfectly. But with file temp2 (65563 chars total):

        name=%D0%9C%D0%B5%D0%B1%%B5%D0%BB%D1%8C%D0%A4%%B0%D0%B1%D1%80%D0%B8%D0%BA%D1%8A&user_payed=0000-00-00&...positions%5B5231%5D=on

        curl -H "X-Requested-With: XMLHttpRequest" -X POST --data @temp2 'http://host/mypage.php'

    curl returns nothing. It looks like a problem with Apache/PHP/php.ini or something like that. I checked .htaccess:

        php_value post_max_size 20M

    Which other parameters should I check? Is it possible that the %-encoded data kills PHP/Apache? Or the total number of parameters (about 2800)?
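
    A few limits worth checking beyond post_max_size (a sketch; the Suhosin values only apply if that patch is installed, which is common on shared hosting, and max_input_vars only exists on PHP 5.3.9+):

        ; php.ini values (or php_value lines) that commonly break large POSTs
        post_max_size = 20M
        max_input_vars = 5000            ; PHP >= 5.3.9, default 1000
        suhosin.post.max_vars = 5000     ; Suhosin patch, default 1000
        suhosin.request.max_vars = 5000  ; Suhosin patch, default 1000

    With about 2800 parameters in the failing request, a default 1000-variable cap would match the symptom exactly.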

    Read the article

  • Losing Windows XP files. How to retrieve them?

    - by ravi
    I have a portable 500 GB HDD. Over the last few days, all the files in certain folders have become corrupted somehow; I can't access or delete them. If I try to access these files/folders, I get an error. This corruption is spreading across my HDD: so far it has affected two folders worth 70 GB, one of which is the backup folder where all my important data resides, so I stand to lose a lot if this data is gone. How can I retrieve this data? Please help.

    Read the article

  • Passing a file with multiple patterns to grep

    - by Michael Goldshteyn
    Let's say we have two files.

    match.txt, a file containing patterns to match:

        fed ghi
        tsr qpo

    data.txt, a file containing lines of text:

        abc fed ghi jkl
        mno pqr stu vwx
        zyx wvu tsr qpo

    Now, I want to issue a grep command that should return the first and third lines from data.txt:

        abc fed ghi jkl
        zyx wvu tsr qpo

    ... because each of these two lines matches one of the patterns in match.txt. I have tried:

        grep -F -f match.txt data.txt

    but that returns no results. grep info: GNU grep 2.6.3 (Cygwin). OS info: Windows 2008 R2. Update: It seems that grep is confused by the space in the pattern lines, but with the -F flag it should treat each line in match.txt as an individual match pattern.
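
    One plausible culprit, given the Cygwin environment (an assumption, not confirmed by the post): match.txt saved with Windows CRLF line endings, so each pattern effectively ends in an invisible carriage return that never matches:

        # strip carriage returns from the pattern file, then retry
        tr -d '\r' < match.txt > match.unix.txt
        grep -F -f match.unix.txt data.txt

        # or, if dos2unix is installed:
        dos2unix match.txt

    With clean LF endings, grep -F -f does exactly what the update describes: each line of the file is one fixed-string pattern, spaces included.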

    Read the article

  • Recommended RAM and disc space for Oracle 11g on Windows

    - by Álvaro G. Vicario
    I need to provide the recommended amount of RAM and disc space (divided into two partitions) so the customer can create an appropriate virtual machine to run Oracle. All I could find in the documentation was a brief listing of minimum RAM for the typical/advanced install types. The virtual machine will run the latest Oracle Standard Edition One (11g Release 2 so far) under Windows Server 2008 x64 and will host a reasonably low-traffic web application. How much RAM and disc space must I ask for in order to be safe? (Feel free to ask for further details if I've omitted something relevant.) Update: rough estimates:
    - Database size: 10 MB after installation
    - Growth rate: +3 MB per day on average
    - Size of the database's 'active' data: not sure what this means; there is no actual archive, so I guess all data is current
    - Amount of data written per second in peak hours: a few KB
    - Number of client sessions: 3 or 4 at most
    - Frequency and response size of the heaviest requests: some reports involve heavy table JOINs that need up to 20 seconds to complete, but they won't return more than a few thousand rows of plain text; the app also handles BLOBs (typically 50 KB to 200 KB)
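
    Working just from those estimates (arithmetic on the numbers above, not an Oracle recommendation): +3 MB/day is roughly 1.1 GB/year, so even five years of growth plus BLOBs stays in the single-digit GBs, and the Oracle software footprint (on the order of 5 GB for an 11gR2 install) will dwarf the data. In other words, the documented install minimums, not this workload of 3-4 sessions writing a few KB/s, will dominate whatever figure is quoted.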

    Read the article

  • SSL FTP fails on Windows 7 but not Windows XP clients

    - by Andrew Neely
    We currently use a free SSL-FTP client called MOVEit Freely to transmit data from a custom data-entry program at over forty facilities scattered around the state to our central server. Under XP it works flawlessly. Some facilities have upgraded to Windows 7. On these machines, uploads (transfers to us) work; downloads (transfers from us to them) fail. Replacing the Windows 7 machine with an XP machine solves the problem. We have verified that the network firewall settings have not changed, and the problem persists even when the Windows firewall is not running (we remoted into one of the Windows 7 machines to confirm it was indeed turned off). We cannot replicate the problem on our own Windows 7 machines and are at a loss as to how to fix this for our customers. The data contain health-related information and need to be encrypted (hence SSL-FTP). Despite hours spent on Google, we cannot find a solution.

    Read the article

  • How to sync a folder on a remote computer to a server on a domain

    - by Pierre-Alain Vigeant
    We have a small remote office that often shares data with us. I learned that the data is shared as email attachments, which obviously leads to versioning hell and overwriting. I am looking for a way for them to synchronize a folder directly with our main office domain controller. I personally use Live Mesh, but I would like a tool that synchronizes to our server directly, without a 3rd party hosting the data, since we already have an online backup service taking care of the offsite backup. What enterprise-class tool would let us synchronize a folder from a remote computer that is outside our domain into the file server of our domain? The synchronization has to be two-way, e.g.: someone from the remote office creates an invoice; someone from our office reviews it and modifies it; the remote office needs to see the change. Our server runs Windows Server 2003.

    Read the article

  • Calculate geometric mean in Excel

    - by Libby
    I have some email network data in Excel as an edge list, meaning I have columns Vertex1, Vertex2, and then N columns of properties of that edge, such as how many emails were sent from one person to another. For each row in the data, Vertex1 is the source of a message and Vertex2 is the target, so edges are directed. Here's some sample data:

        Vertex1   Vertex2   nMessages
        Bob       Cindy     12
        Cindy     Bob       3
        Bob       Mike      11
        Cindy     Mike      1

    I'm trying to calculate a geometric mean of the form

        gm = sqrt[(# of edges ij) * (# of edges ji)]

    So gm for Bob and Cindy is gm = sqrt[(messages from Bob to Cindy) * (messages from Cindy to Bob)], or sqrt(12 * 3) = 6. Is there a way to make that a formula in Excel?
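
    One way to do it (a sketch, assuming Excel 2007+ for SUMIFS and that Vertex1, Vertex2 and nMessages sit in columns A, B and C with headers in row 1): use SUMIFS to fetch the message count for the reversed pair, then take the square root. In D2, filled down:

        =SQRT(C2 * SUMIFS($C:$C, $A:$A, B2, $B:$B, A2))

    For the Bob/Cindy row this evaluates to SQRT(12 * 3) = 6. If the reverse edge doesn't exist, SUMIFS returns 0 and so does the geometric mean, which you may want to trap with an IF.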

    Read the article

  • How to consolidate servers with a not-very-strong infrastructure

    - by Sim
    All, our situation is this: we are in the retail industry with about 10 distributors and use Solomon as the standard ERP for all our systems. Each distributor has 1 HQ and 5-10 branches, and each branch has its own server (Windows 2000/XP/2003 + Solomon + a built-in POS system). Every day the branches have to extract data and send it (via email/Skype) to HQ for data consolidation. When we first deployed our ERP, the infrastructure (e.g. Internet connection) wasn't reliable enough; that's why we went with the decentralized model (each branch got its own server). Now the infrastructure is mature, and we need to consolidate data more quickly (going straight from HQ to our company, instead of branches to HQ to our company).

    Goal: keep Solomon servers only in the distributor HQs, with all branch transactions (retrieved from POS) synchronized to the HQ server directly, plus a backup plan in case the Internet or the HQ server goes down.

    Question: given the above, what model would you suggest? Should we use Terminal Services or other solutions? Any watch-outs or suggestions? Any good articles to read about this? Thanks a lot.

    Read the article

  • ODSI + WebLogic = JDBC problem

    - by Giuseppe Di Federico
    I'm currently developing a web service using ODSI through Oracle Workshop for WebLogic (formerly AquaLogic). I created a data source on WebLogic using the "Oracle thin driver 10g" driver, and the connection test succeeds on WebLogic. (My database is Oracle 10, 10.2.0.1.0.) The problem occurs when I try to create the Physical Data Service in Oracle Workshop. I choose the following options:
    - Data source type = Relational
    - Data source = [THE CORRECT NAME OF THE SOURCE SET ON WEBLOGIC]
    - Database type = ???

    The workshop doesn't allow me to select the database type. I guess it's a problem related to the driver set on WebLogic, but I'm not sure. Does anyone know the nature of my problem? Thanks.

    Read the article

  • Automatic LaTex document generation from Excel spreadsheet

    - by Bowler
    I have some data in an Excel file from which I have to generate a report. I repeat this task fairly regularly and am looking to automate it. I have a LaTeX project into which I usually just copy data by hand, export the necessary worksheets as PDFs, add them to my LaTeX project, and compile with pdflatex. It has occurred to me that there must be a way to automate this process. Is there an efficient way to export the data from Excel into a LaTeX project? Possibly a VBA script in Excel could run the process. Also, it doesn't have to be LaTeX; I'm not all that experienced with MS Office's more advanced features. Is there some way, akin to a mail merge, that I could achieve this with? In some ways that might be better, in case I have to pass the work on to someone who doesn't know LaTeX. Thanks.
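
    One possible direction (a minimal sketch, not a drop-in solution: the sheet name, output path and column formatting are all assumptions) is a small VBA macro that writes a worksheet range out as a LaTeX tabular, which the LaTeX project can then \input:

        ' Hypothetical macro: dump Sheet1's used range to table.tex as a LaTeX tabular
        Sub ExportToLaTeX()
            Dim r As Range, i As Long, j As Long, txt As String
            Set r = Worksheets("Sheet1").UsedRange
            Open ThisWorkbook.Path & "\table.tex" For Output As #1
            Print #1, "\begin{tabular}{" & String(r.Columns.Count, "l") & "}"
            For i = 1 To r.Rows.Count
                txt = ""
                For j = 1 To r.Columns.Count
                    txt = txt & r.Cells(i, j).Text
                    If j < r.Columns.Count Then txt = txt & " & "
                Next j
                Print #1, txt & " \\"
            Next i
            Print #1, "\end{tabular}"
            Close #1
        End Sub

    The main document would contain \input{table.tex} where the data belongs, and a batch file could run the macro and then pdflatex so the whole report becomes one command.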

    Read the article

  • NetFlow Storage Calculator

    - by javano
    I am planning to deploy a NetFlow server (using NfSen/NfDump) to harvest data from Cisco devices. Are there standard calculations or guidelines I can use to work out my server requirements? Specifically, I need to plan for storage. Is there a way of knowing how much data I will collect per day, given N flows? Say one device has 10k flows per day: is that typically some known number of MBs that I can scale up from? If not, how many flows are you all recording per day, and how much data is that generating? Hopefully we can build an estimate from everyone else's figures! P.S. If it makes a difference, I'll be collecting from at most 50 devices (none pushing more than 50 Mbps each).
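
    A rough back-of-the-envelope, under a loud assumption (a NetFlow v5 export record is 48 bytes on the wire, and nfdump's on-disk records are in the same ballpark, so call it ~100 bytes per stored flow to be safe):

        10,000 flows/day/device x ~100 bytes = ~1 MB/day/device
        50 devices x ~1 MB/day = ~50 MB/day = ~1.5 GB/month

    Real flow counts vary wildly with traffic mix (a busy 50 Mbps link can easily produce millions of flows per day), so capturing a day or two of real data per device class before sizing the disks is safer than any formula.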

    Read the article

  • Migrate SharePoint to an SBS server

    - by Eric Lorson
    We have a SharePoint 2003 server and we need to migrate its data to SharePoint on an SBS 2011 server. We cannot use the migration tool because one of the servers is SBS and the other is not. We exported the SharePoint data from the old system, but the import into the SBS SharePoint fails with very little information on why. I suspect a schema conflict, but I am not that familiar with SBS and I am not finding the error in the Windows logs. Has anyone had to migrate data from a non-SBS system to an SBS system? Or can anyone help me figure out where to look for more information on what is going on?

    Read the article

  • pnp4nagios does not generate perfdata

    - by gonvaled
    I am running Nagios 2, pnp4nagios 0.6.16 and PHP 5.2.4-2ubuntu5.19. In my setup, pnp4nagios correctly collects perfdata, which can be browsed in graphical form via the web interface for lots of services. The perfdata directory contains entries of the kind:

        /usr/local/pnp4nagios/var/perfdata/zeus/Disk_Space_Home.rrd
        /usr/local/pnp4nagios/var/perfdata/zeus/Disk_Space_Home.xml

    I have activated performance data for a new Nagios service:

        define serviceextinfo {
            host_name            zeus
            service_description  450average
            action_url           /pnp4nagios/index.php?host=$HOSTNAME$&srv=$SERVICEDESC$
        }

    This service emits monitoring data in the format status_info|perf_data, as required for performance gathering, but somehow the performance data related to this service is not being collected by pnp4nagios (there are no related entries in /usr/local/pnp4nagios/var/perfdata). Are there any pnp4nagios scripts or settings I could use to debug this?
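
    One setting worth ruling out (a guess from the symptom, not something the post confirms): serviceextinfo only attaches the PNP link; whether Nagios processes a service's perfdata at all is controlled on the service definition itself, e.g.:

        # sketch; the real check/command directives are omitted
        define service {
            host_name            zeus
            service_description  450average
            ...
            process_perf_data    1   ; a 0 here (or inherited from a template) silently drops perfdata
        }

    Nagios also needs process_performance_data=1 and the service_perfdata_* settings in nagios.cfg, but since other services already graph fine, the per-service flag is the likelier gap.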

    Read the article

  • Can I mark a folder as mountpoint-only?

    - by Collin
    I have a folder, ~/nas, on which I usually mount a network drive with sshfs. Today I didn't realize the share hadn't been mounted yet and copied some data into it; it took me a while to realize that I'd just copied the data onto my own local drive rather than the network share. Is there some way to mark this folder as supposed-to-be-a-mount-point, so that nothing can copy data into it while it's unmounted? I tried the permissions solution from "How to only allow a program to write to a directory if it is mounted?", but if I don't have write access I also can't mount anything onto it.
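
    One trick that may fit (a sketch, assuming a local filesystem that supports chattr, e.g. ext3/ext4): make the empty, unmounted directory immutable. Creating files inside it then fails, but mounting over it still works because the mount doesn't modify the directory itself:

        # one-time setup (directory must be empty and unmounted)
        sudo chattr +i ~/nas

        # accidental copies now fail fast with "Operation not permitted":
        cp report.txt ~/nas/

        # mounting over it is unaffected:
        sshfs user@nas:/share ~/nas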

    Read the article
