Search Results

Search found 21336 results on 854 pages for 'db api'.


  • WGet from one site on a server to another site on the same server

    - by JoshReedSchramm
    Hey all, I've recently been asked to administer a couple of Ubuntu boxes running web servers. I'm a dev by trade, so if this question is fairly noob, please forgive me. We have about a dozen sites running on this box. Two of our sites need to talk back and forth over a RESTful API. Unfortunately we are having issues with the sites connecting to each other via wget. When we try to run wget manually from the command line on the server, pointing to a site also hosted on that server, it hangs and eventually times out. If we do the same thing from outside the server to the same site, it works. Is there something that could be preventing sites on the same server from communicating with each other? The same thing happens when pinging the site from the server.
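    A minimal diagnostic sketch, assuming the sites reach each other by hostname and the suspicion is name resolution or hairpin NAT (site-b.example.com is a placeholder for the real site name):

        # Check what the other site's name resolves to from the server itself
        host site-b.example.com
        # This is the request that currently hangs
        wget -O /dev/null --timeout=10 http://site-b.example.com/
        # Point the name at the loopback interface and retry; if this works, the web
        # server is fine and the problem is routing/hairpin NAT or a firewall rule
        echo "127.0.0.1 site-b.example.com" | sudo tee -a /etc/hosts
        wget -O /dev/null --timeout=10 http://site-b.example.com/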

    Read the article

  • Network Block Device (NBD) clients for Windows or similar solutions

    - by przemoc
    Are there any NBD clients for Windows? Strangely, I cannot find any, or I am searching for them in the wrong way. Such a client would presumably be a driver with a front-end tool (possibly command-line) that lets you create virtual drives and associate them with given hosts (or simply localhost) and ports where NBD servers are listening. From the user's perspective the virtual drive should behave like a physical drive, so it should be accessible as something like \\.\PhysicalDriveX (maybe \\.\VirtualDriveX?) and be visible at least in Disk Management (diskmgmt.msc) and mountvol. (The only thing I found remotely close to NBD on Windows is ImDisk's proxy mode and its companion tool devio, but AFAIK ImDisk only works at the partition level (so no virtual drive) and devio uses a different protocol.) A secondary question: are there any (preferably simple) Windows-specific solutions that allow creating a virtual drive which delegates read/write requests to user space via some explicit mechanism (TCP, IPC, a DLL implementing a given API, etc.)?

    Read the article

  • SQL server availability issue: large query stops other connections from connecting

    - by Carlos
    I've got a high-spec (multicore, RAID) server running MS SQL 2008, with several databases on it. I have a low-throughput process that periodically needs a small amount of information from one of the DBs, and the code seems to work fine. However, sometimes when one of my colleagues runs a huge query against one of the other DBs, I see full CPU usage on the machine, and connections from my app time out. Why does this happen? I would have thought the many cores and hard disks (together with a cleverly written DB server) would somehow keep at least some of the resources free for other apps. I'm pretty sure he doesn't use multiple connections for his query. What can I do to prevent this?

    Read the article

  • Convert SQL Query results to Active Directory Groups

    - by antgiant
    Are there any quality products (ideally open source) that allow me to run an arbitrary SQL query that returns two columns (username, group name) and then add each username in AD to the group of that name in AD? If the username doesn't exist, it is ignored. If the group name doesn't exist, ideally it gets created. Updated for clarity: I have an MSSQL-based system that is the authoritative source for some of the Active Directory security groups and their members. I want those Active Directory security groups to be populated by a one-way sync originating from MSSQL. Sadly the MSSQL-based system does not have a good API, so I will have to do this with direct SQL calls. Is there anything that does this well?

    Read the article

  • Connect to MySQL through the command line without needing the root password

    - by ReynierPM
    I'm building a Bash script for some tasks. One of those tasks is to create a MySQL DB from within the same script. What I'm doing right now is creating two vars: one to store the user name and the other to store the password. This is the relevant part of my script:

        MYSQL_USER=root
        MYSQL_PASS=mypass_goes_here
        touch /tmp/$PROY.sql && echo "CREATE DATABASE $DB_NAME;" > /tmp/script.sql
        mysql --user=$MYSQL_USER --password="$MYSQL_PASS" < /tmp/script.sql
        rm -rf /tmp/script.sql

    But I always get an error saying access denied for user root with NO PASSWORD. What am I doing wrong? I need to do the same for PostgreSQL, any help? Regards and thanks in advance
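    A sketch of one common approach, assuming the root password itself is valid and the goal is simply to supply it non-interactively (file contents and the example database name are placeholders):

        # MySQL: keep credentials in an option file instead of on the command line
        printf '[client]\nuser=root\npassword=mypass_goes_here\n' > ~/.my.cnf
        chmod 600 ~/.my.cnf
        mysql -e "CREATE DATABASE $DB_NAME;"    # the mysql client reads ~/.my.cnf automatically

        # PostgreSQL: either run as the postgres OS user...
        sudo -u postgres createdb "$DB_NAME"
        # ...or use a ~/.pgpass file (format host:port:database:user:password) with psql
        printf 'localhost:5432:*:postgres:pg_pass_goes_here\n' > ~/.pgpass
        chmod 600 ~/.pgpass
        psql -h localhost -U postgres -c "CREATE DATABASE $DB_NAME;"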

    Read the article

  • BIND DNS server on Solaris 10 and Windows XP clients

    - by stevecomptech
    Hi, I added this to the zone db file (I am running Solaris 10):

        _ldap._tcp.mydomain.com.               SRV 0 0 389 dc.mydomain.com.
        _kerberos._tcp.mydomain.com.           SRV 0 0 88  dc.mydomain.com.
        _ldap._tcp.dc._msdcs.mydomain.com.     SRV 0 0 389 dc.mydomain.com.
        _kerberos._tcp.dc._msdcs.mydomain.com. SRV 0 0 88  host.mydomain.com.

    Now I get this error when I try to join Windows XP to the domain:

        The query was for the SRV record for _ldap._tcp.dc._msdcs.mydomain.com
        The following domain controllers were identified by the query: host.mydomain.com
        Common causes of this error include:
        - Host (A) records that map the name of the domain controller to its IP addresses
          are missing or contain incorrect addresses.
        - Domain controllers registered in DNS are not connected to the network or are not running.

    What do I need to change so that my Windows XP machine can join the domain?
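    A quick verification sketch against the BIND server itself (assuming it listens on this host; replace 127.0.0.1 with the DNS server address the XP client actually uses):

        # Does the SRV record the client asks for come back, and does the target
        # name it points at have an A record?
        dig @127.0.0.1 SRV _ldap._tcp.dc._msdcs.mydomain.com. +short
        dig @127.0.0.1 A dc.mydomain.com. +short
        dig @127.0.0.1 A host.mydomain.com. +short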

    Read the article

  • Backing up data (including mysqldumps) to S3

    - by seengee
    We have a web app on a number of servers and we want to add an additional layer of redundancy by backing up the key data to S3. The key data is the MySQL database and a folder containing dynamically created site assets - predominantly images. Some kind of rsync-based solution would initially seem the best plan. A couple of years ago we played with s3cmd (in particular s3cmd sync) with some success, but we didn't find it particularly reliable, although this may have changed since. It's occurred to me, though, that an rsync solution might not work particularly well with a single db.sql file created with mysqldump, and I assume this means the whole database gets transferred each time; with multiple databases of over 1GB this is going to add up to a lot of traffic (and $s) very quickly. With the image files I could simply transfer files modified within the last day, which would be far simpler. What approach should I look at?
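    A rough sketch of the dump-and-push side, assuming s3cmd is already configured (the bucket name and paths are placeholders). It doesn't avoid re-sending a changed dump, but per-database compressed dumps keep the uploads smaller and independent:

        # One compressed dump per database, dated, then synced to S3
        for db in $(mysql -N -e 'SHOW DATABASES' | grep -Ev '^(information_schema|mysql)$'); do
            mysqldump --single-transaction "$db" | gzip > "/backup/${db}-$(date +%F).sql.gz"
        done
        s3cmd sync /backup/ s3://my-backup-bucket/mysql/
        # The image folder can be synced as-is, since per-file diffing works well there
        s3cmd sync /var/www/site-assets/ s3://my-backup-bucket/assets/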

    Read the article

  • How can I insert the quoted price of gold from kitco.com into my Excel spreadsheet?

    - by Frank Computer
    Kitco.com provides a real-time price quote for gold and other metals. I have a spreadsheet that makes calculations based on the price of gold, and I would like this real-time value to be updated automatically in my Excel sheet. I tried 'get external data' from a website, but that didn't work. Any ideas? EDIT ADDED: Kitco has a gadget called KCAST which displays real-time quotes on the Windows taskbar. I tried capturing those values from the taskbar, but that didn't work either. Maybe if Kitco provided an API or feed, it could be done?

    Read the article

  • Restore Picasa people tags

    - by Paul
    I have installed Windows 7 on my laptop. Before doing this I backed up all my pictures using the Picasa backup utility. I then ran a restore on the clean Windows 7 install. I then installed Picasa 3.5, and none of the people tags showed up. I then deleted what I thought was the Picasa DB and tried running the restore again. Now each folder shows up twice in Picasa but only once under the Windows Pictures folder. How do I get rid of the duplicates in Picasa and get my people tags back?

    Read the article

  • Rsync fails for files that start with an underscore when the destination is ZFS

    - by Eric
    Hi everyone. I'm using rsync 3.1.0pre1 on Mac OS X 10.8.5, and am trying to rsync one folder to another. The destination is a ZFS volume mounted via SMB. The problem I'm having is that files whose names start with an underscore (e.g. '_filename.jpg') are not being successfully synced to the destination. I get the following error message:

        rsync: mkstemp "/path/to/destination/._filename.jpg.NUgYJw" failed: Permission denied (13)

    In this case, '_filename.jpg' does not make it to the destination. I understand that rsync creates hidden, temporary files at the destination which are prefixed with '.' and have a random suffix appended. But the original filename starts with '_', not '.', and I haven't asked rsync to copy extended attributes / resource forks over (unless it always does that). The rsync command I'm using is:

        rsync -avE --exclude='.DS_Store' --exclude '.Trash' --exclude 'Thumbs.db' --exclude '._*' --delete /source/ /destination/

    Has anyone found a way around this problem? Thank you!
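    A couple of hypothetical workarounds to test, since the failure is on rsync's dot-prefixed temporary file rather than on the source file itself (both flags are standard rsync options; whether they help depends on what the SMB/ZFS side is rejecting):

        # 1) Skip the temporary file and write updates in place
        rsync -av --inplace --exclude='.DS_Store' --exclude='.Trash' --exclude='Thumbs.db' --delete /source/ /destination/
        # 2) Or keep the temporary files in a directory where mkstemp is allowed
        rsync -av --temp-dir=/tmp --exclude='.DS_Store' --exclude='.Trash' --exclude='Thumbs.db' --delete /source/ /destination/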

    Read the article

  • Apache httpd + FreeTDS hangs until restarted

    - by Jordan Reiter
    Every so often, requests to a Linux server (say, linux.example.org) where the web app (Django) pulls in data from a SQL Server database via FreeTDS will hang. Requests on other servers pointing to the same database still work, as do requests on linux.example.org that use local MySQL databases. Only this server plus FreeTDS appears to be affected. Restarting httpd makes the database connections work correctly again. What could cause this problem? Using: CentOS 5.9, FreeTDS 0.91, Apache httpd 2.2.3.

    /etc/odbc.ini:

        [DSN]
        Description = SQL Server 2005
        Driver = FreeTDS
        ;Database = dbname
        Servername = SERVERNAME
        ;TDS_Version = 8.0

    /etc/freetds.conf:

        [SERVERNAME]
        driver = /usr/lib64/libtdsodbc.so
        host = db.example.org
        port = 1433
        tds version = 8.0
        client charset = UTF-8
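    A small diagnostic sketch for the next time it hangs, exercising the FreeTDS and unixODBC layers directly so Apache/Django can be ruled in or out (username/password are placeholders; TDSDUMP turns on FreeTDS's own protocol log):

        # FreeTDS's test client, using the [SERVERNAME] entry from freetds.conf
        TDSDUMP=/tmp/freetds.log tsql -S SERVERNAME -U username -P password
        # unixODBC's test client, using the [DSN] entry from odbc.ini
        isql -v DSN username password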

    Read the article

  • TS connection lost but not local

    - by Owl
    I have an office that is connected, through a VPN tunnel, to a shared DB server and Terminal Services server that other offices use as well. For some reason, all workstations in the office will lose connectivity to the remote address but will remain connected to their local internet. I have checked both firewalls to ensure all settings match accordingly. They seem to go down at routine times every day, and it doesn't last longer than a minute or two. Any ideas of what this may be? OS: Windows Server 2003 R2, Terminal Services 5.2, Symantec Gateway 320 & Symantec Firewall/VPN 100

    Read the article

  • How to install OpenSSL on Windows 7, and how to check whether it is enabled?

    - by Andy
    How do I install OpenSSL on Windows 7, and how can I check whether it's enabled? I currently run PHP through the command line locally, not on a server. Thanks. I recently installed PHP 5.2.17. I ran a program which connects to an https server and I got the following error:

        Notice: file_get_contents(): Unable to find the wrapper "https" - did you forget to enable it
        when you configured PHP? in C:\java\newsweaver-api-v2\simple\list-tags.php on line 30

    I added extension=php_openssl.dll to php.ini, but I'm wondering: is OpenSSL native to PHP 5.2.17, or do I need to download an extension? Thanks
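    A quick check sketch from the same command line, assuming php.exe is on the PATH (the Windows builds of PHP usually ship php_openssl.dll in the ext\ folder, so typically only the extension line and a correct extension_dir in php.ini are needed):

        php -m | findstr /i openssl
        php -r "var_dump(extension_loaded('openssl'));"
        php --ini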

    Read the article

  • Cannot join computer to the domain... greyed out

    - by Logman
    I have an old Windows XP Pro SP3 computer I need to join to the domain. Simple, right? Not really. When I go to Control Panel - System - Computer Name and click on Change ("rename this computer"), everything is greyed out. I cannot switch it from a workgroup to a domain. I am logged on locally as an admin (both the built-in account and one I created). I have checked local policy (gpedit.msc) on the computer, but it feels like a needle in a haystack. I could probably reload an image faster than trying to fix this... but I am curious, so I'm posting here to see if anyone knows of it / a fix. I tried resetting the policy to defaults, but no luck:

        secedit /configure /cfg %windir%\repair\secsetup.inf /db secsetup.sdb /verbose

    EDIT:

    Read the article

  • Real-time Image Resize, Cropping and Caching Server Product

    - by Elijah
    I'm investigating what products are out there that will allow you to request images through an HTTP API in arbitrary image sizes. The server would sit behind a CDN but would still need to be able to handle a fair bit of traffic and possibly be load-balanced. I've been tasked with writing such a service, but I wanted to do some due diligence to see what commercial or open-source solutions are out there. Google has not been particularly helpful; it may be because I have been searching for the wrong term. Third-party sites and services are out of the question because of corporate policies.

    Read the article

  • Remote I/O costs with a Content Delivery Network

    - by x711Li
    As far as I know, the time it takes to scan a directory and the number of files in said directory are correlated due to I/O costs. Would the administrative cost of placing the files in a hashed directory tree for uploading/downloading files through a CDN API be worth it for the added efficiency? For instance, given the filename foo.mp3, its MD5 hash is 10ebb1120767e9de166e0f5905077cb1. Thus, storing foo.mp3 as ./10/eb/foo.mp3 would allow for fewer files per directory (since MD5 output is hexadecimal, this gives 16^2 = 256 root directories with 256 subdirectories each, and little chance of hash collision). Considering the directories themselves are not loaded, would the I/O cost of directory scanning still exist with direct uploading/downloading?
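    A minimal sketch of the path-generation step being described (the file name is the example from the question; everything else is illustrative):

        f=foo.mp3
        # hash of the file name (hashing the contents via `md5sum "$f"` works the same way)
        h=$(printf '%s' "$f" | md5sum | cut -c1-32)
        dir="${h:0:2}/${h:2:2}"            # e.g. 10/eb : 256 top-level dirs, 256 each below
        mkdir -p "$dir" && cp "$f" "$dir/"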

    Read the article

  • How to migrate Lotus Notes Mail in database to Exchange public folder?

    - by elsni
    I need to migrate a Lotus Notes mail database to Exchange. In the past, users received mail in their Notes mail accounts and sorted it manually, by drag and drop, into a specific folder structure in a separate Notes mail database. This should be mapped to Exchange. I considered using Outlook and a SharePoint library mapped to Outlook, but Outlook does not support drag and drop from the mail account to a standard document library, and the discussion libraries do not support folders. So I think the easiest way is to use an Exchange public folder instead of a SharePoint library, which should work as it did in Notes (correct me if I'm wrong). But how do I migrate the old Notes DB to the public folder, including all subfolders? Thank you!

    Read the article

  • Advice on off-site backup of Hyper-V Failover Cluster

    - by Paul McCowat
    We are currently setting up a Server 2008 R2 box which will be off-site, reached over a leased line with VPN. At the main site are 2 x Hyper-V hosts in a failover cluster with a PowerVault MD3000i iSCSI SAN. We are using BackupAssist for local backups; each host backs up itself and its guests nightly, creating a 500GB backup each, which is copied to a 2TB rotated NAS drive. Files and SQL DBs are also backed up / log-shipped etc. I'm looking for the best way to back up the Hyper-V VMs and copy them off-site so that the OSes are at most a month old and the data a day old. The main backups are too large to transfer between backup runs, so the options discussed so far are: take rotating individual backups of the VMs each day and copy them over (Day 1 SQL VM, Day 2 Exchange VM, etc.), which would require more storage; look into Hyper-V snapshots, though I don't believe these are supported in clustering; or third-party replication tools.

    Read the article

  • Domain name in server other than the website

    - by med
    Hello! I'm not very familiar with hosting, but I have a special situation: I live in Tunisia, and I can buy a domain name with the .tn extension, but the problems are: 1- The domain cannot be pointed to a server outside Tunisia. 2- All servers in Tunisia are bad; no one provides really reliable hosting. So I want to use the .tn domain with basic hosting in Tunisia, and have the DB queries and rich media served from another remote server outside Tunisia. How do I do it? Are there better alternatives? All suggestions are welcome :) Thank you.

    Read the article

  • innodb memory usage mysql

    - by Tiddo
    I have a small VPS with only 256MB of RAM, with bursting up to 512MB. When I configure my VPS without InnoDB, it only uses 130MB of RAM, so that is no problem for me. But when I turn on InnoDB, the memory usage grows to about 300-400MB. Is it possible to run InnoDB such that I won't exceed 256MB? Preferably I don't want to use more than 100MB for InnoDB. I already came across some sites which said I could limit the memory usage, but if I limit it to only 100MB, will the DB run well enough (compared to, for example, the MyISAM storage engine)? If 100MB is too little memory for InnoDB, can you recommend any other storage engine which supports transactions?
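    A hedged starting point for capping InnoDB's footprint (the variable names are standard MySQL settings of that era; the values and the config-file path are guesses to be tuned, and total usage will still include per-connection buffers on top of these):

        # Append a small-memory InnoDB profile to the MySQL config (Debian/Ubuntu-style include path assumed)
        printf '[mysqld]\ninnodb_buffer_pool_size = 64M\ninnodb_log_buffer_size = 2M\ninnodb_additional_mem_pool_size = 2M\nmax_connections = 30\n' \
            | sudo tee /etc/mysql/conf.d/innodb-small.cnf
        sudo /etc/init.d/mysql restart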

    Read the article

  • Fixing Broken Groups

    - by themaestro
    Hey, I just got onto a new project with the student government at my university, and we're trying to get our web server into a more workable state. The current problem is that all of us for some reason have sudo power on the server, but we can't write/create files anywhere on the server (as far as we can tell). Our groups are currently as follows:

        /srv/ice/db$ groups goshri sshamim rmenezes
        goshri : goshri
        sshamim : sshamim ptx
        rmenezes : rmenezes ptx
        daifotis : daifotis ptx

    We added a few of us to ptx because we thought that might give us write access, but it didn't. We have a bunch of webapps running on this server, but since it's a university, things change hands quickly. What can we do to give ourselves write access?
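    A sketch of the usual group-permissions fix, assuming the webapps live under /srv/ice and ptx is meant to be the shared admin group (the path and usernames are taken from the question; adjust to the real web root):

        sudo usermod -aG ptx goshri                        # ensure each admin is in ptx (log out/in afterwards)
        sudo chgrp -R ptx /srv/ice                         # hand the tree to the shared group
        sudo chmod -R g+rwX /srv/ice                       # group read/write; execute only where already executable
        sudo find /srv/ice -type d -exec chmod g+s {} +    # new files in these dirs inherit the ptx group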

    Read the article

  • scrape data from a website and post it on the blog (wordpress)

    - by Pennf0lio
    This could be one for DocType, but I'm looking for software, or just a plugin for WordPress. I want to fetch data from a website and automatically post it on my blog (WordPress-powered). The site doesn't have an RSS feed or API to get the data, so I currently have to copy and paste it manually, one item at a time, and post it on WordPress. Do you know of alternative options for my process, or of software or a plugin that does the job? Thanks!

    Read the article

  • Sharing / replicating EBS across AWS nodes

    - by skrat
    I would like to use a single EBS volume across multiple EC2 nodes (web/app servers). I've read some articles on snapshot sharing, but that doesn't suit what we need. We use the filesystem for storing DB record attachments, so if one such attachment gets created, we need it to be immediately available to all nodes (to serve). So far only NFS seems viable, but it's a pain to configure and maintain. Another option could be storing those attachments on S3 instead, but that would cut us off from doing any analysis of that data. This must be quite a common problem when scaling in AWS; what solutions are there?
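    For reference, a minimal sketch of the NFS option mentioned above (the IP range, export path and mount point are placeholders; the EBS volume stays attached to one node, which exports it to the rest):

        # On the node that has the EBS volume attached and mounted at /data/attachments
        echo "/data/attachments 10.0.0.0/24(rw,sync,no_subtree_check)" | sudo tee -a /etc/exports
        sudo exportfs -ra

        # On each web/app node
        sudo mount -t nfs 10.0.0.10:/data/attachments /data/attachments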

    Read the article

  • What could cause a file system to spontaneously unmount or become invalid for a short time?

    - by Ichorus
    We've got DB2 LUW running on a RHEL box. We had a crash of DB2, and IBM came back and said that a file DB2 was trying to access (through open64()) had unmounted or become invalid. We have done nothing but restart the database, and things seem to be running fine. Also, the file in question looks perfectly normal now:

        $ cd /db/log/TEAMS/tmsinst/NODE0000/TEAMS/T0000000/
        $ ls -l
        total 557604
        -rw------- 1 tmsinst tmsinst 570425344 Jan 14 10:24 C0000000.CAT
        $ file C0000000.CAT
        C0000000.CAT: data
        $ lsattr C0000000.CAT
        ------------- C0000000.CAT
        $ ls -l
        total 557604
        -rw------- 1 tmsinst tmsinst 570425344 Jan 14 10:24 C0000000.CAT

    With those facts in hand (please correct me if I am mis-interpreting the data), what could cause a file system to 'spontaneously unmount or become invalid for a short time'? What should my next step be? This is on Dell hardware, and we ran their diagnostic tools against it; it came back clean.
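    A small follow-up sketch: the kernel log and syslog around the crash time usually record I/O errors, controller resets or read-only remounts, so they are worth checking before blaming DB2 (the log paths are RHEL defaults; the grep patterns are illustrative):

        dmesg | grep -iE 'i/o error|read-only|remount|scsi|reset'
        sudo grep -iE 'i/o error|read-only|remount|ext3' /var/log/messages*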

    Read the article

  • Using openssl command line tool to encrypt/decrypt data, DES ECB

    - by smsrecv
    Hello. How can I create a random 64-bit key for DES ECB encryption/decryption, and then use the same key for encryption/decryption many times? All this must be done using the openssl command-line tool. In all the examples I have seen, they do not use a "key", they use a "password". But I need a key - an array of bytes - because I need to send it to the other party (I don't know which API they use for cryptography). Then I need to use this key - the array of bytes - to encrypt/decrypt data. Thank you
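    A sketch of one way to do this with standard openssl enc options (-K takes the key itself as hex, so no password-based key derivation is involved; the file names are placeholders):

        # 8 random bytes = 64 bits, printed as 16 hex characters; this hex string is the key to share
        KEY=$(openssl rand -hex 8)
        echo "$KEY"

        # Encrypt and decrypt with the same raw key, as many times as needed
        openssl enc -des-ecb -K "$KEY" -in plain.txt -out cipher.bin
        openssl enc -des-ecb -d -K "$KEY" -in cipher.bin -out plain.decrypted.txt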

    Read the article
