Search Results

Search found 7957 results on 319 pages for 'production databases'.


  • How can I fix puppet refusing to start and asking for "master.pp"?

    - by cwd
    I'm using the very latest version of Puppet and have been following the Apress "Pro Puppet" guide step by step. I installed Puppet with:
        sudo aptitude install ruby libshadow-ruby1.8
        sudo aptitude install puppet puppetmaster facter
    I edited /etc/puppet/puppet.conf to include a certname:
        [master]
        certname=puppet.mydomain.com
    I edited /etc/hosts and added the following line:
        127.0.0.1 puppet.mydomain.com puppet
    I set the hostname of the server:
        echo "puppet.mydomain.com" > /etc/hostname
        hostname -F /etc/hostname
    Then I try to run Puppet from the command line:
        puppet master --verbose --no-daemonize
    and Puppet gives me this error:
        Could not parse for environment production: Could not find file /master.pp
    I'm running all commands with sudo, and the last line of the error message always says that it can't find master.pp; the path before it is my current working directory. What am I doing wrong? I should also mention that I don't have a DNS record set up for puppet.mydomain.com - I saw some online documentation mentioning this might be a problem - however I was fairly sure that the hosts file would let me get around that.

    Read the article

  • Starting Redmine at boot using upstart on Ubuntu

    - by joekr
    So after installing Redmine from the repositories on Ubuntu 12.04, I've tried to create an upstart script (see below) so it would start up at boot time. While "service redmine start" does start Redmine, it does not start automatically when booting. Also, when I run "service redmine stop" it does stop Redmine but simply hangs until I press Ctrl+C, and after stopping, starting it again does not work (it also just hangs). From all the upstart examples and tutorials I have seen this should work, so I guess I'm overlooking something. Any hints?
        # Redmine
        description "Redmine"
        start on started networking
        stop on stopping networking
        stop on starting shutdown
        console output
        expect daemon
        exec ruby /usr/share/redmine/script/server webrick -e production -b 188.40.184.155 -p 3000 -d
        respawn
    EDIT: Fixed "typo"
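    For illustration only, a quick diagnostic pass once the job file is in place: it checks whether upstart has picked up the job at all and which PID it is tracking, since a mismatch between "expect daemon" and how the process actually forks is a common cause of exactly this kind of hang. The job name "redmine" is the one implied by the script above.
        # reload job definitions after editing /etc/init/redmine.conf
        sudo initctl reload-configuration
        # confirm upstart knows about the job, then start it
        initctl list | grep redmine
        sudo start redmine
        # compare the PID upstart is tracking with the ruby process that is really running
        status redmine
        ps -ef | grep [s]cript/server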

    Read the article

  • Database modularity with EBS volumes

    - by Eclyps19
    I would like to add modularity to my websites on EC2 instances by encapsulating the site files and the MySQL files in their own EBS volumes. The end result that I'm going for is the ability to quickly mount a volume or two to different servers running the same AMI (for testing/development/emergency maintenance, etc.), as well as maintain separate snapshots of each. I'm able to do this fairly easily with a single database by symlinking my mounted database EBS volume to the appropriate places (/var/lib/mysql, /etc/my.cnf, /var/log/mysqld.log), but I'm not sure if it would even be possible to have multiple databases on different EBS volumes running concurrently. Example:
        /website1/www.website.com
        /database1/
        /website2/www.otherwebsite.com
        /database2/
    Could anybody shed some light on this for me? Is it possible? Is it a bad idea? Thanks.
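    For reference, a minimal sketch of the single-database arrangement the question describes, with hypothetical device and mount point names (substitute your own volumes); the stock files are moved aside before the symlinks are created.
        # hypothetical device and mount point -- substitute your own volume names
        sudo mkdir -p /database1
        sudo mount /dev/xvdf /database1
        # move the stock data directory aside, then point MySQL at the EBS copy
        sudo mv /var/lib/mysql /var/lib/mysql.orig
        sudo ln -s /database1/mysql      /var/lib/mysql
        sudo ln -s /database1/my.cnf     /etc/my.cnf        # assumes the original my.cnf was renamed first
        sudo ln -s /database1/mysqld.log /var/log/mysqld.log
    Running a second database concurrently is the open question above: a single mysqld uses a single datadir, so a second volume would need its own mysqld instance (separate port and socket) rather than another set of symlinks.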

    Read the article

  • Command line import of database using latin1 encoding

    - by chrisjlee
    I'm using a particular cloud hosting solution (one which I won't name) and they don't provide SSH access, so I'm at the mercy of however the database is dumped. I downloaded the dump, which is packed into a tar.gz file, and discovered that it uses latin1 encoding. I don't get to specify the encoding on the host I'm using because I don't have SSH access or DB access. I try to import it via the command line into my local development environment (mysql -uroot foodb < file.db) like I usually do with other databases, but am having problems. Is it possible to import a database via the command line by specifying which encoding (preferably latin1) before importing it? Or do I have to convert it to UTF8?
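    A minimal sketch of the kind of invocation being asked about, using the database and file names from the question (dump.tar.gz is a placeholder). The flag only forces the client connection character set, so the outcome still depends on whether the dump file itself contains its own SET NAMES statements.
        # unpack the dump, check whether it declares its own character set, then import as latin1
        tar -xzf dump.tar.gz
        grep -i 'set names' file.db | head
        mysql --default-character-set=latin1 -uroot foodb < file.db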

    Read the article

  • Migrating master-slave MySQL database servers to 2 new servers, any tips or suggestions?

    - by mmattax
    I'm setting up 2 new database servers that will be replacing a current master-slave setup. All boxes are running / will be running MySQL on RHEL. Our current and new naming conventions:
        db1  - master database
        db2  - slave (using MySQL replication)
        db01 - new master
        db02 - new slave
    We need to get db01 to be the new master with db02 as the new slave. What is the best way to migrate db1 and db2 to db01 and db02? db1 and db2 are running in a production setting and we need to minimize all downtime; db1 has roughly 30GB of data in the database. Any suggestions or tips on how to migrate to our new servers would be much appreciated.
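    One commonly used approach, sketched here purely as an assumption rather than anything stated in the question: seed db01 from a consistent dump of db1 that records the binlog position, replicate db1 -> db01 until cutover, then seed db02 from db01 the same way. The host names are the ones above; everything else is illustrative.
        # take a consistent dump of the current master, recording its binlog coordinates
        mysqldump -h db1 -u root -p --all-databases --single-transaction --master-data=2 --routines | gzip > db1.sql.gz
        # load it into the new master
        gunzip < db1.sql.gz | mysql -h db01 -u root -p
        # db02 can then be seeded from db01 with the same procedure before replication is repointed at db01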

    Read the article

  • "This computer has dynamically assigned IP addresses" error when installing Active Directory Domain Controller

    - by smhnaji
    This is a working Windows Server 2008 machine on which I need to install Active Directory. I found http://www.howtogeek.com/99323/ and followed the steps. After "Additional Domain Controller Options", I'm shown the warning "This computer has dynamically assigned IP addresses". As I read it, the message states that dynamic IP addressing has been used for the server, but this is wrong. When I go to Network and Sharing Center and click on Local Area Connections - Properties - Internet Protocol Version 4 (TCP/IPv4) - Properties, I see that the main IP address (as well as the DNS server) and also all other IP addresses are assigned statically. So it should be OK. I can't believe any server would be using dynamic IPs! Note: No IPv6 has been set for the server. Please tell me why the error is given and which of the available options I should choose. Note that it's a production server and is working with many users in a WORKGROUP. No change should affect the IPs, nor the users connecting to the server.

    Read the article

  • Installing gitlab on Debian 6.0.5

    - by helmus
    I am using the following directions in an attempt to install GitLab on Debian 6.0.5: https://github.com/gitlabhq/gitlabhq/blob/stable/doc/installation.md I am getting an error when I run the following command:
        sudo -u gitlab bundle exec rake gitlab:app:setup RAILS_ENV=production
    which fails with:
        WARNING: #<ArgumentError: Illformed requirement ["#<Syck::DefaultKey:0x00000004b52198> 1.1.4"]>
        # -*- encoding: utf-8 -*-
        Gem::Specification.new do |s|
          s.name = %q{carrierwave}
          s.version = "0.6.2"
          s.required_rubygems_version = Gem::Requirement.new(">= 0") if s.respond_to? :required_rubygems_version=
          s.authors = ["Jonas Nicklas"]
          ....more error....
          s.add_dependency(%q<mini_magick>, [">= 0"])
          s.add_dependency(%q<rmagick>, [">= 0"])
          end
        end
        WARNING: Invalid .gemspec format in '/usr/local/lib/ruby/gems/1.9.1/specifications/carrierwave-0.6.2.gemspec'
        Could not locate Gemfile
    Some pointers to what could cause this would be much appreciated; I have only a little experience with RoR, and it seems to be related to that.

    Read the article

  • How does geolocation based on IP address work?

    - by Martin
    Like all Internet users, I've visited web sites which appear to know in which country and city I'm located. I understand that these web sites typically look up my IP address in a database which maps IP addresses to countries / cities, and that this works fairly well. I've also seen companies selling this type of database. How is this database, which maps an IP address to a country / city, created in the first place? Is there a central database somewhere where each ISP registers the link between IP address and country/city? Or do the companies selling geolocation databases contact different ISPs and purchase the mapping information from them? Or is there some organization 'above' the ISPs that keeps track of this?
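    For illustration, the raw material such databases start from is visible with an ordinary whois lookup: the regional internet registries publish which organisation (and country) each address block is allocated to, and commercial providers refine that with their own data. The address below is just a documentation placeholder.
        # the registry record for a block already carries a country and an organisation
        whois 203.0.113.5 | grep -iE 'country|org|netname'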

    Read the article

  • Public folder emails not being delivered

    - by Rob
    Hello, we have just introduced an Exchange 2010 installation into our existing Exchange 2003 (all Standard) environment. We make a lot of use of our public folders in 2003, so I want to make a small public folder tree in the 2010 system to test some applications against. I have created a few public folders in the 2010 Public Folder Management tool and mail-enabled them, gotten email addresses, etc. However, mail will not be delivered; it queues in my existing 2003 Exchange server's 'Local Delivery' queue, and eventually times out and bounces. I guess the Exchange 'system', including the new 2010 server, thinks that all public folder email must be delivered to the old 2003 server. Is it possible for me to have two public folder databases that each receive mail? If so, is there something I am missing to enable this? Thanks -R

    Read the article

  • Is there a simple way to backup and restore all Microsoft SQL Server database objects related to a particular application?

    - by Nathan Hartley
    I would like to back up not only the databases that belong to a particular application living on a shared server, but also those things that get stored outside of the database: the server accounts, jobs, maintenance plans, and whatever else I can't think of at the moment. This backup should be complete enough that its corresponding restore will recreate the entire application on a different SQL Server. This seems like a problem others must have dealt with in the past. So before I embark on creating custom PowerShell scripts for each application, I have come to ask you... Can you help?

    Read the article

  • Is it easy to update ubuntu beta to ubuntu final release?

    - by Peter Smit
    At this moment I am preparing a virtual server to host a web application which needs PHP 5.3. The virtual server base image is always Hardy (8.04 LTS). There is no PHP 5.3 until the upcoming release in a few days: Lucid (10.04 LTS). I see two options:
        - waiting until the final version is released and then starting to prepare this server
        - upgrading now to the beta (do-release-upgrade --devel-release) and, when the final release has come, upgrading to that
    For time constraints I would prefer the second option. I just can't find out whether it will be easy to upgrade from a beta to the 'clean' final release. Is this possible in an easy way? Will it have any drawbacks for security, or will there be any traces left of it ever being a beta release? Note: the server will not go into production before the LTS is really installed.
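    A short sketch of how the second option usually plays out, offered as background rather than a guarantee: the beta and the final release live in the same archive, so once 10.04 is final, catching up is an ordinary package upgrade rather than another release upgrade.
        # move the Hardy base image to the Lucid beta now
        sudo do-release-upgrade --devel-release
        # after the final release, the same archive keeps receiving updates;
        # catching up to the released packages is a normal upgrade
        sudo apt-get update
        sudo apt-get dist-upgrade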

    Read the article

  • How can I do an "insert" statement from MS SQL -> MYSQL using linked servers

    - by bvandrunen
    Is it possible to do an "insert" statement through linked servers? I know it is possible by using MSDTC... but does this work between MS SQL and MySQL? Any help would be greatly appreciated. As of right now I can update and select between the 2 databases, but it gives me an error when I try to run an insert statement:
        OLE DB provider "MSDASQL" for linked server "******" returned message "Query-based insertion or updating of BLOB values is not supported.".
        Msg 7343, Level 16, State 2, Line 1
        The OLE DB provider "MSDASQL" for linked server "******" could not INSERT INTO table "[*********]...[******_options]".
        Location: memilb.cpp:1493
        Expression: (*ppilb)->m_cRef == 0
        SPID: 76
        Process ID: 1644

    Read the article

  • Weird unexpected image compression on a web server running Apache on Ubuntu?

    - by Billy Bob Thornton
    I have a weird problem on my production web server running Apache on Ubuntu: it compresses my images, thereby dramatically lowering their quality! Actually I have two virtual hosts running, each located in a different folder. Whether I display .gif images by navigating the two sites or access them directly by their URL, their size and quality are invariably degraded. I tried with three different browsers: same problem. Using them on other sites on the Web: no problem. Of course I disabled mod_deflate on the server (which should not compress images anyway), but the phenomenon remains. On my local development server, running the same configuration, everything is OK. Now I'm completely lost! For the record, my configuration: Ubuntu 10.04, Apache 2, PHP 5.
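    A small diagnostic sketch that may help pin down where the degradation happens (the URLs and paths are hypothetical): if the bytes served by production already differ from the file on disk, the problem is on the server side; identical checksums would point at something between server and browser instead.
        # compare the response headers for the same image on both servers
        curl -sI http://dev.example.com/images/logo.gif | grep -iE 'content-(length|encoding|type)'
        curl -sI http://www.example.com/images/logo.gif | grep -iE 'content-(length|encoding|type)'
        # compare the actual bytes served with the file on disk
        curl -s http://www.example.com/images/logo.gif | md5sum
        md5sum /var/www/site/images/logo.gif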

    Read the article

  • Portable scripting language for a multi-server admin?

    - by Aaron
    Please note: portable as in portableapps.com, not the traditional definition. Originally posted on stackoverflow.com; asking here at another user's suggestion. I'm a DBA and sysadmin, mostly for Windows machines running SQL Server. I'm looking for a programming/scripting language for Windows that doesn't require Admin access or an installer, needing no install process other than expanding it into a folder. My intent is to have a language for automation on which I can standardize. Up to this point I've been using a combination of batch files and Unix shell, using sh.exe from UnxUtils, but it's far from a perfect solution. I've evaluated a handful of options, and all of them have at least one serious shortcoming or another. I have a strong preference for something open source or dual-licensed, but I'm more interested in finding the right tool than anything else. Not interested in anything that relies on Cygwin or Java, but at this point I'd be fine with something that needs .NET.
    Requirements:
        - Manageable footprint (1-100 files, under 30 MB installed)
        - Runs on Windows XP and Server (2003+)
        - No installer (exe, msi)
        - Works with external pipes, processes, and files
        - Support for MS SQL Server or ODBC connections
    Bonus points:
        - Open source
        - FFI for calling functions in native DLLs
        - GUI support (native or gtk, wx, fltk, etc.)
        - Linux, AIX, and/or OS X support
        - Dynamic, object oriented and/or functional, interpreted or bytecode compiled; interactive development
        - Able to package or compile scripts into executables
    So far I've tried:
        - Ruby: 148 MB on disk, 23000 files
        - Portable Python: 54 MB on disk, 2800 files
        - Strawberry Perl: 123 MB on disk, 3600 files
        - REBOL: great, except closed source and no MSSQL or ODBC in the free version
        - Squeak Smalltalk: great, except poor support for scripting
    ---- cut: points of clarification ----
    Why all the limitations? I realize some of my criteria seem arbitrarily confining. It's primarily a product of my environment. I work as a SQL Server DBA and backup Unix admin at a division of a large company. In addition to nearly a hundred boxes running some version or another of SQL Server on Windows, I also support the SQL Server Express Edition installs on over a thousand machines in the field. Because of our security policies, I don't have login rights on every machine. Often enough, an issue comes up and I'm given local Admin for some period of time. Often enough, it's some box I've never touched and I don't have my own environment set up yet. I may have temporary admin rights on the box, but I'm not the admin for the machine - I'm just the DBA. I've no interest in stepping on the toes of the Windows admins, nor do I want to take over any of their duties. If I bring up "installing" something, suddenly it becomes a matter of interest for Production Control and the Windows admins; if I'm copying up a script, no one minds. The distinction may not mean much to the readers, but if someone gets the wrong idea I've suddenly got a long wait and significant overhead before I can get the tool installed and the problem solved. That's why I want something that can be copied and run in the manner of a portable app.
    What about the small footprint? My company has three divisions, each in a different geographical location, and one of them is a new acquisition. We have different production control/security policies in each division. I support our MSSQL databases in all three divisions. The field machines are spread around the US, sometimes connecting to the VPN over very slow links. Installing Ruby using psexec has taken a long time over these connections. In these instances, the bigger time waster seems to be archives with thousands and thousands of files rather than their sheer size. You could say I'm spoiled by Unix, where the admins usually have at least some modern scripting language installed; I'd use PowerShell, but I don't know it well and, more importantly, it isn't everywhere I need to work. It's a regular occurrence that I need to write, deploy and execute some script on short notice on some machine I've never logged in on. Since having Ruby or something similar installed on every machine I'll ever need to touch is effectively impossible because of the approvals, time and Windows admin labor needed, it makes more sense to find a solution that allows me to work on my own terms.

    Read the article

  • DataTable to JSON

    - by Joel Coehoorn
    I recently needed to serialize a datatable to JSON. Where I'm at we're still on .Net 2.0, so I can't use the JSON serializer in .Net 3.5. I figured this must have been done before, so I went looking online and found a number of different options. Some of them depend on an additional library, which I would have a hard time pushing through here. Others require first converting to List<Dictionary<>>, which seemed a little awkward and needless. Another treated all values like a string. For one reason or another I couldn't really get behind any of them, so I decided to roll my own, which is posted below. As you can see from reading the //TODO comments, it's incomplete in a few places. This code is already in production here, so it does "work" in the basic sense. The places where it's incomplete are places where we know our production data won't currently hit it (no timespans or byte arrays in the db). The reason I'm posting here is that I feel like this can be a little better, and I'd like help finishing and improving this code. Any input welcome.
        public static class JSONHelper
        {
            public static string FromDataTable(DataTable dt)
            {
                string rowDelimiter = "";
                StringBuilder result = new StringBuilder("[");
                foreach (DataRow row in dt.Rows)
                {
                    result.Append(rowDelimiter);
                    result.Append(FromDataRow(row));
                    rowDelimiter = ",";
                }
                result.Append("]");
                return result.ToString();
            }

            public static string FromDataRow(DataRow row)
            {
                DataColumnCollection cols = row.Table.Columns;
                string colDelimiter = "";
                StringBuilder result = new StringBuilder("{");
                for (int i = 0; i < cols.Count; i++)
                {
                    // use index rather than foreach, so we can use the index for both the row and cols collection
                    result.Append(colDelimiter).Append("\"")
                          .Append(cols[i].ColumnName).Append("\":")
                          .Append(JSONValueFromDataRowObject(row[i], cols[i].DataType));
                    colDelimiter = ",";
                }
                result.Append("}");
                return result.ToString();
            }

            // possible types:
            // http://msdn.microsoft.com/en-us/library/system.data.datacolumn.datatype(VS.80).aspx
            private static Type[] numeric = new Type[] { typeof(byte), typeof(decimal), typeof(double),
                typeof(Int16), typeof(Int32), typeof(SByte), typeof(Single),
                typeof(UInt16), typeof(UInt32), typeof(UInt64) };

            // I don't want to rebuild this value for every date cell in the table
            private static long EpochTicks = new DateTime(1970, 1, 1).Ticks;

            private static string JSONValueFromDataRowObject(object value, Type DataType)
            {
                // null
                if (value == DBNull.Value) return "null";

                // numeric
                if (Array.IndexOf(numeric, DataType) > -1)
                    return value.ToString(); // TODO: eventually want to use a stricter format

                // boolean
                if (DataType == typeof(bool))
                    return ((bool)value) ? "true" : "false";

                // date -- see http://weblogs.asp.net/bleroy/archive/2008/01/18/dates-and-json.aspx
                if (DataType == typeof(DateTime))
                    return "\"\\/Date(" + new TimeSpan(((DateTime)value).ToUniversalTime().Ticks - EpochTicks).TotalMilliseconds.ToString() + ")\\/\"";

                // TODO: add Timespan support
                // TODO: add Byte[] support
                // TODO: this would be _much_ faster with a state machine

                // string/char
                return "\"" + value.ToString().Replace(@"\", @"\\").Replace(Environment.NewLine, @"\n").Replace("\"", @"\""") + "\"";
            }
        }

    Read the article

  • I need a server or service that reroutes DNS requests

    - by Relentim
    We have two external servers, Dev and Prod. We are a software house, and in the code we have a subdomain, metrics.company.com, that points to Prod. Development is continuous, and our internal and external developers and testers will need to switch from Dev to Prod and back again. It is not an option to have a different subdomain in the code during development and change this for production. The way we wish to switch between Dev and Prod is to use DNS. We need a public DNS server that behaves normally apart from routing metrics.company.com to Dev. The users will be able to swap their DNS back and forth to hit the different servers. What is the easiest way to do this? Is there a company that hosts this service, or am I going to have to rent a server and set it up myself? Any help would be much appreciated.
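    One way to get "normal DNS except for this one name" without a full authoritative setup, sketched purely as an assumption (the Dev address below is a placeholder): a small dnsmasq forwarder that overrides the single record and passes every other query to an upstream resolver.
        # /etc/dnsmasq.d/metrics.conf: answer one name locally, forward the rest upstream
        echo 'address=/metrics.company.com/203.0.113.10' | sudo tee    /etc/dnsmasq.d/metrics.conf
        echo 'server=8.8.8.8'                            | sudo tee -a /etc/dnsmasq.d/metrics.conf
        sudo service dnsmasq restart
    Testers would then point their resolver at this box to hit Dev, and back at normal DNS to hit Prod.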

    Read the article

  • Windows 2003, MySQL 5.0.24, Windows Updates causes problems with MySQL?

    - by Alessandro
    Hi, our Windows 2003 web server has 2 GB of RAM and MySQL v. 5.0.24-community-nt. Possibly due to Windows Updates (is that even possible??) I'm having problems with the MySQL databases, and I have to restart the IIS services every day. Events:
        Changed limits: max_open_files: 2048 max_connections: 800 table_cache: 619
        Do you already have another mysqld server running on port: 3306 ?
        Can't connect to MySQL server on 'localhost' (10055)
    Should I increase innodb_buffer_pool_size from 250M to 500M? Or/and something else? Thanks

    Read the article

  • Experiences using VLC for video-on-demand streaming? (VLM)

    - by StackedCrooked
    I'm considering my options for implementing a VOD service. Until recently my choices seemed to be either Wowza or Darwin, but now I discovered VLM, which looks really cool. I am going to stream MPEG4 H.264 video with AAC audio. I'm probably going to use the RTSP protocol, but I'm willing to use HTTP as well (after reading this article). Can anyone comment on his or her experiences with VLM? How does it compare to Darwin or Wowza? Is it stable and worthy of production use? Are there any limitations or performance problems?

    Read the article

  • What do you use to loadbalance IPv6 services?

    - by Michael Renner
    Hi, the current Linux software environment for IPv6 load balancing looks a bit grim. IPVS has rudimentary support for IPv6, but it's far from complete. NAT for IPv6 seems to be a no-go. Are there any other projects which aim for this goal? Does the IPv6 support in other OSes look better? Are there any commercial products which have been successfully used in production environments with non-trivial load patterns? Or is it just that the time for IPv6 hasn't come... yet? ;) best regards, Michael

    Read the article

  • SQL Server 2008 Replication Promotion

    - by Stefan Mai
    I have a 4 node cluster, 1 subscriber and 3 publishers, all running SQL Server 2008 R2 Enterprise. The intention is that if the subscriber goes down, we can use one of the publishers to quickly build up its replacement. Our testing reveals a problem though: the subscriber databases all have Not For Replication set to Yes on the identity columns so that they can maintain the identity set in the subscriber. This causes a problem when they become subscribers because now we don't have identity insert functionality: we get a primary key error. Any way to "promote" a subscriber to publisher?

    Read the article

  • Most awesomely bad hack

    - by Zypher
    As I sit watching one of my latest dirty, dirty hacks run, I started wondering what kind of dirty hacks you have created that are so bad they are awesome. We all have a few of them in our past - and they are probably still running in production somewhere, chugging along, somehow still working. Which reminds me of the hack we had to put into place when we were moving data centers. Our IVRs had to keep running, as the data center we were moving from was the primary DC, and the new primary wasn't quite ready to take traffic. So what did we do? Well, we answered the calls in DC1, then shipped the SIP stream over the internet to DC2, 1900 miles away... that just felt oh so wrong. So the question is: what is one (or more) of your awesomely bad hacks?

    Read the article

  • install PECL JSON in PHP 5.0.4

    - by Radu Maris
    Is it OK (compatible) to install the native PECL JSON extension (from here) in PHP 5.0.4 on a production server running FC4, where unfortunately I cannot update PHP to at least 5.2? If there is a good chance of screwing up the PHP installation on the server, I will not try to install it and will stick to Services_JSON ( http://svn.php.net/viewvc/pear/packages/Services_JSON/trunk/ ). In the documentation ( http://aurore.net/projects/php-json/ ) I have found: "A simple ./configure; make; make install should do the trick. Make sure to add an extension=json.so line to your php.ini/php.d." (but I can't find anything about compatible PHP versions). Thank you. (Please don't tell me to update the OS and PHP, because it's not my decision :( )
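    For reference, the build steps quoted from that documentation, followed by a quick sanity check; whether the extension actually compiles against PHP 5.0.4 is exactly the open question here, so treat this as a sketch rather than a recommendation.
        # inside the unpacked php-json source directory
        ./configure
        make
        sudo make install
        # enable the module (php.ini or a php.d snippet, as the documentation says), then verify
        echo 'extension=json.so' | sudo tee /etc/php.d/json.ini
        php -m | grep -i json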

    Read the article

  • MySQL query very slow on Amazon RDS but really fast on my laptop?

    - by Luc
    I would love to know if anybody knows why this is happening. I've just migrated our website over to Amazon RDS, and our biggest query, which takes 0.2 seconds to execute on my MacBook, takes 1.3 seconds to execute on the most expensive RDS instance. Obviously I've disabled the query cache (and tested this) on my local computer, and both databases are exactly the same: InnoDB, both with the same indexes, etc. It's costing us a fortune ($2000 per month) for the fastest RDS instance and I'm losing faith quickly. Any ideas?
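    A first diagnostic pass, sketched with placeholder host, user and query names: comparing the optimizer's plan and the InnoDB buffer pool size on the two servers usually shows whether the gap comes from a different plan, an undersized or cold buffer pool, or simply network round trips.
        # compare the query plan on the laptop and on RDS
        mysql -u root mydb -e "EXPLAIN SELECT ... \G"
        mysql -h myinstance.xxxxxx.us-east-1.rds.amazonaws.com -u admin -p mydb -e "EXPLAIN SELECT ... \G"
        # compare how much memory InnoDB has been given on each side
        mysql -h myinstance.xxxxxx.us-east-1.rds.amazonaws.com -u admin -p -e "SHOW VARIABLES LIKE 'innodb_buffer_pool_size'"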

    Read the article

  • Windows Experience Index Dropped After Adding Dedicated Graphics Card

    - by Ludo
    I purchased a new PC with a Gigabyte Z68X-UD3H-B3 motherboard. I had a Radeon HD 5450 graphics card spare, so I added that instead of using the onboard graphics, as I just presumed it would be better. But my Windows Experience Index has gone down, from 5.4 to 5: desktop performance has dropped from 5.4 to 5, although gaming graphics has gone up from 5.9 to 6.2. I'm not actually going to be using the machine for gaming, just audio production, but I added the card as I'll possibly be doing video editing in the future too. Why would this be? Can I trust Windows Experience Index scores? Or is it possible the onboard graphics are just better for general desktop use? Thanks! Ludo.

    Read the article

  • Commercial version of Freenet6

    - by grnbeagle
    I've been using Freenet6 from gogoNET to make my mobile device publicly accessible via IPv6. It works quite well, except that the service is not very stable, because it is for non-commercial usage only and its servers are hosted for free by different operators around the world. Apparently gogoNET sells hardware called gogoSERVER which allows one to build a service similar to Freenet6. I've inquired with their sales team, but they were unable to tell me which companies have a commercial, production-quality implementation of Freenet6. Specifically, I'm looking for the following features in an IPv6 service provider:
        - Client-based IPv6 connectivity for mobile devices: gogoCLIENT (gw6c) is ideal for mobile devices since it allows a device to go online regardless of its location.
        - API for account maintenance: so that we can create device accounts from our software.
        - Static IPv6 address (or maybe I mean IPv4 address): by this I mean that, just like the gogoNET6 service (username.broker.freenet6.net), we want to provide our users with a permanent URL for their device.
    Any info on commercial IPv6 service providers utilizing gogoSERVER is appreciated. Thanks.

    Read the article
