Search Results

Search found 85708 results on 3429 pages for 'new sysadmin'.


  • Migrating Magento Concern

    - by Pankaj Upadhyay
    We have a Magento 1.5.0.1 store running at a hosting provider and need to migrate it to a new hosting provider. A technician at the new provider told me to do the following:
    1. Go into the cPanel Backup Wizard.
    2. Make a FULL BACKUP and download the zip file.
    3. Upload that zip file to their server in my root folder.
    4. Tell them, and they will do the restore.
    My concerns: Will everything work as expected? What about the connection strings and the database? Will the database be created automatically and work the same as before? I have also read somewhere that version 1.5.0.1 used an older database format that might not work on newer MySQL versions; could that have an impact? Should I proceed in this manner, or do I need to take care of additional things to ensure a smooth migration?
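    One concrete thing to check after the restore, assuming a standard Magento 1.x layout: the database connection settings live in app/etc/local.xml, and the host, username, password and database name there must match whatever the new provider sets up (the values below are placeholders):
        <config>
          <global>
            <resources>
              <default_setup>
                <connection>
                  <host><![CDATA[localhost]]></host>
                  <username><![CDATA[new_db_user]]></username>
                  <password><![CDATA[new_db_password]]></password>
                  <dbname><![CDATA[new_db_name]]></dbname>
                </connection>
              </default_setup>
            </resources>
          </global>
        </config>
    If the cPanel full backup restores the MySQL database under the same name and user, nothing needs to change; otherwise this file is where the "connection strings" are edited.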

    Read the article

  • DVI monitor detected only on computer startup

    - by kamil
    I've recently connected a new monitor, LG M2252D-PZ, to a rather outdated computer with Windows XP and Radeon 9600. XP has SP3 installed, video drivers are the latest version back from the times the video card was still supported. My problem is that the monitor works fine only as long as I don't turn it off or switch it to a different input. When I turn it back on, it says "no signal". The key to the problem must be the DVI port, to which the new monitor is connected. The previous monitor was connected to the VGA output, and I've tested that the new one also works fine when connected to the analogue port. Apparently, the computer tests for the presence of a monitor on the DVI port only on startup. The question is, how do I change this?

    Read the article

  • AS2 Server Software Costs

    - by CandyCo
    We're currently using Cleo LexiCom as our server software for receiving EDI transmissions via the AS2 protocol. We have 7 trading partners per year, and this runs us about $800/year for support from Cleo. We need to expand from 7 trading partners to 10 or so, and Cleo charges roughly $600 per new host, plus an expanded yearly support fee. My question(s) are: Does anyone know of a cheaper developer of AS2 server software, and perhaps one that doesn't charge per new host? Does anyone have any clue why we are being charged an upfront fee for new hosts, and if this is a standard practice for AS2 software providers? It seems really odd that we are required to pay upfront costs for this. I could completely understand an increase in the yearly support, however.

    Read the article

  • Convert text to table

    - by Quattro
    I would like to convert text into a table. Here is a link to the text: http://www.tcdb.org/public/tcdb
    Short example:
    >gnl|TC-DB|A0CIB0|1.A.17.3.1 Chromosome undetermined scaffold_19, whole genome shotgun sequence OS=Paramecium tetraurelia GN=GSPATT00007662001 PE=4 SV=1
    MDDQNQPILQEQPKPKQKKPLLNTKMVKKQKMQNKKEENLREILNFYTNQVDARKFLQKM
    KAVVDSNQQEKKYQDDFLNPNEYNEMQDIYEDYNMGDLVIVFPNPDADGVKNPPITYKEA
    PLTKTNFYSKIGNVSYENDIDELCVDEMEYLRNMRNVDGEHMDQDHVKEEI
    >gnl|TC-DB|A0CS82|9.B.82.1.5 Chromosome undetermined scaffold_26, whole genome shotgun sequence - Paramecium tetraurelia.
    MIIEEQIEEKMIYKAIHRVKVNYQKKIDRYILYKKSRWFFNLLLMLLYAYRIQNIGGFYI
    VTYIYCVYQLQLLIDYFTPLGLPPVNLEDEEEDDDQFQNDFSELPTTLSNKNELNDKEFR
    PLLRTTSEFKVWQKSVFSVIFAYFCTYIPIWDIPVYWPFLFCYFFVIVGMSIRKYIKHMK
    KYGYTILDFTKKK
    I would like the columns delimited with, for example, a pipe | or a semicolon ;, so a record would become:
    |>gnl|TC-DB|A0CIB0|1.A.17.3.1| Chromosome undetermined scaffold_19, whole genome shotgun sequence OS=Paramecium tetraurelia GN=GSPATT00007662001 PE=4 SV=1| MDDQNQPILQEQPKPKQKKPLLNTKMVKKQKMQNKKEENLREILNFYTNQVDARKFLQKM KAVVDSNQQEKKYQDDFLNPNEYNEMQDIYEDYNMGDLVIVFPNPDADGVKNPPITYKEA PLTKTNFYSKIGNVSYENDIDELCVDEMEYLRNMRNVDGEHMDQDHVKEEI
    I am working with Windows and don't know how to do this. I only know that every record starts with >. I want to substitute the first whitespace in a record with a delimiter like | or ;, add another delimiter after the first newline of the record, and everything between that first newline and the next > should go into a new column (it is the sequence of a protein).
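    Not from the original thread, but here is one way this could be scripted on Windows with PowerShell; the file names are placeholders and the output layout (id fields|description|sequence) is an assumption about the table you want:
        # Sketch: convert a FASTA-style file (records start with ">") into pipe-delimited rows.
        $inFile  = "tcdb.txt"
        $outFile = "tcdb_table.txt"
        $header = $null
        $sequence = ""
        $rows = @()
        Get-Content $inFile | ForEach-Object {
            if ($_.StartsWith(">")) {
                if ($header -ne $null) { $rows += "$header|$sequence" }   # flush the previous record
                $idx = $_.IndexOf(" ")                                    # first whitespace becomes "|"
                if ($idx -gt 0) { $header = $_.Substring(0, $idx) + "|" + $_.Substring($idx + 1) }
                else            { $header = $_ }
                $sequence = ""
            } else {
                $sequence += $_                                           # join wrapped sequence lines
            }
        }
        if ($header -ne $null) { $rows += "$header|$sequence" }
        $rows | Set-Content $outFile
    Each output line then contains the ">gnl|TC-DB|..." identifier, a pipe, the description, another pipe, and the whole protein sequence in one column.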

    Read the article

  • Move and clone VirtualBox machines with filesystem commands

    - by mit
    I know of two ways to clone a VirtualBox machine on a Linux host. One is to use the VirtualBox GUI and export and re-import the machine as an appliance (in the File menu of VirtualBox). The other is to clone only the virtual disk container:
    VBoxManage clonevdi source.vdi target.vdi
    (Taken from http://forums.virtualbox.org/viewtopic.php?p=853#p858 ) I would then have to create a new VM afterwards and attach the cloned virtual disk. Is there a way I can just copy a virtual disk file and do the rest by hand? I would have to manually edit ~/VirtualBox/VirtualBox.xml and insert a new disk and a new machine: can I just make up UUIDs, or how would this work? I would very much prefer this hardcore method, as it lets me freely and rapidly back up, restore, move or clone machines. Or is there a better way to do this?
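    As a sketch of the copy-by-hand approach (paths are examples): a plainly copied .vdi keeps the UUID of the original, and VirtualBox refuses to register two disks with the same UUID, so the copy needs a fresh one. Rather than making up UUIDs in VirtualBox.xml, recent VBoxManage versions can assign one:
        cp ~/.VirtualBox/HardDisks/source.vdi ~/.VirtualBox/HardDisks/copy.vdi
        VBoxManage internalcommands sethduuid ~/.VirtualBox/HardDisks/copy.vdi
    After that the copy can be attached to a new machine created in the GUI (or with VBoxManage createvm / storageattach) without editing the XML by hand.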

    Read the article

  • App Pool crashes before loading mscorsvr. How to troubleshoot?

    - by codepoke
    I have an app pool that recycles every 29 hours, per the default setting. It recycles smoothly 9 times out of 10, and I'm pretty sure the recycle itself is good for the app. Once every couple of weeks the recycle does not work: the old worker process dies cleanly and the new worker process starts, but it will not serve up content. Recycling the app pool again manually works like a charm; the failed worker process stops and dies cleanly, and a second new worker process fires up and serves content perfectly. I took a crash dump of the failed worker process before recycling it, and DebugDiag found nothing to complain about. I tried to dig a little deeper using WinDbg, but mscorsvr/mscorwks is still not loaded 15 minutes after the new process started. There are 14 threads running (4 async) and 20 pending client connections, but .NET is not even loaded into the process yet. Any suggestions on where to poke and prod to find a root cause?

    Read the article

  • Migrate servers without losing any data / time-limited MySQL dump?

    - by inac
    Is there a way to migrate from an old dedicated server to a new one without losing any data in-between - and with no downtime? In the past, I've had to lose MySQL data between the time when the new server goes up (i.e., all files transferred, system up and ready), and when I take the old server down (data still transferred to old until new one takes over). There is also a short period where both are down for DNS, etc., to refresh. Is there a way for MySQL/root to easily transfer all data that was updated/inserted between a certain time frame?
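    One approach that may fit, assuming binary logging is enabled on the old server: take a full dump that records its binlog position, load it on the new server, and then replay only the changes made after the dump, either continuously via replication or once with mysqlbinlog. A rough sketch (host names, file names and times are placeholders):
        # on the old server: dump with the binlog coordinates recorded in the file
        mysqldump -u root -p --all-databases --single-transaction --master-data=2 > full.sql
        # load it on the new server
        mysql -u root -p < full.sql
        # replay only the changes made in a given time window
        mysqlbinlog --start-datetime="2012-01-01 00:00:00" --stop-datetime="2012-01-01 06:00:00" mysql-bin.000042 | mysql -u root -p
    Making the new server a replication slave of the old one closes the gap entirely: it stays in sync until DNS is switched over, so nothing written in the meantime is lost.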

    Read the article

  • Transfer disk image to larger/smaller disk

    - by forthrin
    I need to switch the hard drive on a 2006 iMac to a new SSD. I don't have the original installation CDs. I know I can order CDs from Apple, but this costs money. Someone told me it's possible to rip the image of the old drive and transfer to the new drive. If so, does the size of the new drive have to be exactly the same as the old? If not, my questions are: Is it possible to "stretch" the image from 120 MB disk to a 256 MB disk (numbers are examples)? If so, what is the command line for this? Likewise, is it possible to "shrink" an image from a larger disk (eg. 256 MB) to a smaller disk (eg. 120 MB), provided that the actual space used on the disk does not exceed 120 MB? How do you do this on the command line?
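    As a sketch of how this is often done on a Mac from the command line (disk identifiers are examples; check yours with diskutil list first): clone the whole old disk onto the new one with dd, then let Disk Utility repair and grow the partition to fill the larger drive. Going the other way, a raw copy onto a smaller disk will not work; the source volume would first have to be shrunk below the target size.
        diskutil list                      # identify source and target disks
        diskutil unmountDisk /dev/disk1    # unmount the target before writing to it
        sudo dd if=/dev/disk0 of=/dev/disk1 bs=1m
    Apple's asr tool (Apple Software Restore) is an alternative that can restore a volume to a differently sized target, e.g. sudo asr restore --source /dev/disk0s2 --target /dev/disk1s2 --erase.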

    Read the article

  • Enable group policy for everything but the SBS?

    - by Jerry Dodge
    I have created a new group policy to disable IPv6 on all machines. There is only the one default OU, no special configuration. However, this policy must not apply to the SBS itself (nor to the other DC at another location on a different subnet), because those machines depend on IPv6; all the rest do not. I did see a recommendation to create a new OU and put that machine under it, but many other comments say that is extremely messy and not recommended, since it makes maintenance harder when other group policies change. How can I apply this single group policy to every machine except the domain controllers? PS - Yes, I understand IPv6 will eventually be the standard, but until then we have no intention of implementing it, and it is in fact causing us many issues when enabled.
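    One commonly suggested way to do this without reorganizing OUs, offered as a sketch rather than a tested recipe: attach a WMI filter to the GPO so it only applies to machines that are not domain controllers. Win32_ComputerSystem exposes a DomainRole value in which 4 and 5 mean backup and primary domain controller, so a filter such as
        SELECT * FROM Win32_ComputerSystem WHERE DomainRole < 4
    excludes the SBS and the other DC while still hitting every workstation and member server. Security filtering (denying Apply Group Policy to the Domain Controllers group on that GPO) is another option with the same effect.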

    Read the article

  • Apply rewrite rule to all but the files (recursive) in a subdirectory?

    - by user784637
    I have an .htaccess file in the root of the website that looks like this:
    RewriteRule ^some-blog-post-title/ http://website/read/flowers/a-new-title-for-this-post/ [R=301,L]
    RewriteRule ^some-blog-post-title2/ http://website/read/flowers/a-new-title-for-this-post2/ [R=301,L]
    <IfModule mod_rewrite.c>
    RewriteEngine On
    ## Redirects for all pages except for files in wp-content to website/read
    RewriteCond %{REQUEST_FILENAME} !-f
    RewriteCond %{REQUEST_FILENAME} !-d
    RewriteCond %{REQUEST_URI} !/wp-content
    RewriteRule ^(.*)$ http://website/read/$1 [L,QSA]
    #RewriteRule ^http://website/read [R=301,L]
    RewriteBase /
    RewriteRule ^index\.php$ - [L]
    RewriteCond %{REQUEST_FILENAME} !-f
    RewriteCond %{REQUEST_FILENAME} !-d
    RewriteRule . /index.php [L]
    </IfModule>
    My intent is to redirect people to the new blog post location if they request one of those special blog posts. If that's not the case, they should be redirected to http://website.com/read. Nothing from http://website.com/wp-content/* should be redirected. So far conditions 1 and 3 are being met. How can I meet condition 2?
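    Without seeing the full site it is hard to be definitive, but note that the redirect block only fires when %{REQUEST_FILENAME} is neither an existing file nor an existing directory, so any URL that maps to something real under the web root never reaches the rule and falls through to WordPress instead. A sketch of that block with the two filesystem conditions dropped and the wp-content exclusion anchored to the start of the URI (so wp-content is still left alone) would be:
        RewriteCond %{REQUEST_URI} !^/wp-content/
        RewriteRule ^(.*)$ http://website/read/$1 [R=301,L,QSA]
    Whether the !-f / !-d conditions can safely be removed depends on what else the old site still has to serve directly.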

    Read the article

  • How do I split a large MySql backup file into multiple files?

    - by Brian T Hannan
    I have a 250 MB backup SQL file, but the limit on the new hosting is only 100 MB. Is there a program that lets you split an SQL file into multiple SQL files? It seems like people are answering the wrong question, so I will clarify: I ONLY have the 250 MB file, and the new hosting only offers phpMyAdmin, with no data in the database yet. I need to take the 250 MB file and upload it to the new host, but there is a 100 MB upload size limit for SQL backup files. I simply need to take one file that is too large and split it into multiple files, each containing only full, valid SQL statements (no statement can be split between two files).
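    If the dump was produced by mysqldump, one low-tech possibility (a sketch, assuming the standard "-- Table structure for table" comment markers are present; on Windows this needs something like Cygwin or GnuWin32 for GNU csplit) is to cut the file at table boundaries, which guarantees no statement is split in half:
        csplit -k backup.sql '/^-- Table structure for table/' '{*}'
    The resulting pieces (xx00, xx01, ...) can then be imported one after another in order. Any piece still over 100 MB would have to be split further, for example by re-exporting that table's INSERT statements in smaller batches.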

    Read the article

  • Upgrading servers, need to keep the domain the same as before. What are the best practices?

    - by nLL
    Hi, I am upgrading a domain controller/file server from Windows Server 2003 Standard to Windows Server 2008 R2 Standard. We are planning to have a file server and an AD controller. Our old hardware will be scrapped; we want to copy all AD users/computers to the new machine and keep the current domain name. I have never done this before. What are the best practices? Is it better if we get a contractor to do it for us? I guess the best way to start is to build the new servers, copy the data, take the old server down and put the new server online. My gut says we would need to re-join all computers. Is that correct? Any input appreciated.

    Read the article

  • How can I move authorized applications between google accounts?

    - by zoopp
    I'm looking into creating an email address with a professional name on Gmail, and since I can't change my current one, I have to create a new Google account. Among the things which need to be patched up (e.g. forwarding email to the new address until every other account's contact email address is changed, etc.), I came across authorized applications. If I am to use the new email address exclusively, I have to somehow move my authorized applications as well, since if I eventually delete my old account I will lose access to the profiles created by those applications (e.g. the Stack Exchange network, YouTube, etc.). How can this move be accomplished?

    Read the article

  • How to copy data (clone) from one partition to another in Windows XP?

    - by Martin
    I have installed a new hard drive in our PC running Windows XP and I wonder how to transfer the data from the old (small) data partition to the new (large) one. My question concerns only a data partition containing files and folders (not the boot partition with the Operating System files!) Is it ok to just copy the folders in the Windows XP Explorer to the new partition? Could anything be lost this way (hidden folders, metadata, ..)? What is the best way to clone a data partition in Windows XP?
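    For what it's worth, a plain Explorer copy can skip hidden or system files and drops NTFS permissions and audit information. A sketch of doing the same copy from the command line with tools that preserve those (drive letters are examples; robocopy ships with the Windows Server 2003 Resource Kit for XP-era systems):
        xcopy D:\ E:\ /E /H /K /X /O /Y
        rem or, with robocopy, mirror everything including security information
        robocopy D:\ E:\ /MIR /COPYALL /R:1 /W:1
    Either way the data lands on the new partition with attributes and ACLs intact, which a drag-and-drop copy does not guarantee.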

    Read the article

  • Access to certain files but not others

    - by ADW
    Hoping someone can help me, as I have thus far been unable to solve the issue. I am running a media center on Ubuntu 12.04. I was initially able to access media files on the Ubuntu desktop from my Windows 7 laptop and Roku device. I then started backing up a new batch of DVDs (into MKV files, like everything else in my media folders) and noticed I cannot access the new files from either the Roku or the laptop. I have not changed any settings in the media folder and have verified the share permissions. The parent folder (Media) is shared (with permission flow-down) while the subfolders (Movies, TV Shows, Music) are not. I changed the permissions on these to shared when the access problem arose, but with no success. I can only access the original files uploaded, and not the new files added. Any suggestions? Thanks in advance for any and all help.
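    A common culprit in this situation (offered as a guess, since the listing isn't shown) is that the newly ripped files were created by a different user or with stricter permissions than the originals, so the account the share uses can read the old files but not the new ones. Comparing and normalizing ownership and read permissions on the media tree would rule that out; the path and user name below are placeholders:
        ls -l /home/user/Media/Movies            # compare owner/permissions of old vs. new files
        sudo chown -R user:user /home/user/Media
        sudo chmod -R a+rX /home/user/Media      # world-readable, directories traversable
    If the new files show different ownership or permissions than the old ones, that is almost certainly the problem.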

    Read the article

  • Problem creating a site collection when the attached database has been deleted

    - by pkspt
    Hi, I created a new site collection in a new content database, but later changed my mind and decided to create it in the same database as the web application. I deleted the site collection I had created in the new database, but now when I create that site again it gets attached to the new database created above, not to the original one. Is there a solution for that? I also tried deleting the database that had been created, after deleting the site collection. Now, when I try to create a site, it looks for the old database and gives me an error because it cannot open it. Any suggestions on this one?

    Read the article

  • Is there a way to edit an existing nautilus (file manager) bookmark?

    - by C.W.Holeman II
    Is there a way to edit an existing Nautilus (file manager) bookmark? Invoke from the Linux command line:
    $ nautilus
    Activate the connection editor via File > Connect To Server... and complete the entries in the pop-up:
    Service Type: [WebDAV (HTTP)]
    Server: [localhost]
    Port: [8001]
    Folder: [webdav]
    Username: [test]
    [x] Add bookmark
    Bookmark name: [/dav]
    <Connect>
    Then in the left column of the main window the new connection and bookmark exist:
    Places
    -------------------
    ausername
    Desktop
    File System
    Network
    WebDAV on localhost
    Trash
    --------------------
    /dav
    Right-clicking on "/dav" pops up a menu:
    Open
    Open in New Tab
    Open in New Window
    ------------------
    Remove
    Rename...
    There is no option for editing.
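    At least for GNOME 2-era Nautilus, bookmarks are kept in a plain text file with one URI per line, so editing the entry there is a possible workaround (the exact path and URI are assumptions based on the setup described):
        gedit ~/.gtk-bookmarks
    A line for the bookmark above might look like "dav://test@localhost:8001/webdav /dav"; changing the URI or the label after the space and restarting Nautilus should update the bookmark. Newer releases moved this file to ~/.config/gtk-3.0/bookmarks.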

    Read the article

  • What are the pros of switching DNS names with a database server hardware upgrade?

    - by wilbbe01
    When we upgrade to new hardware at work we usually increment a number in the DNS name. For example, we have a server called database-2 that is slated to become database-3 in the coming days. I haven't been able to find a good reason why this is good behavior. To me, the work of trying to catch all end-user machines, as well as all servers dependent on the database server, is far riskier than simply moving the database and its IP/name to the new hardware. A little over a year ago we spent several months fielding requests as infrequent users discovered their software still needed to be updated to point to the new DNS name. I am struggling to find answers as to why this is a good practice. So, the question: why is using DNS names as a "server hardware version identifier" a good idea? What am I overlooking? Thanks much.
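    For context, the middle ground many shops use (described here as general practice, not something from the question) is to keep the hardware-versioned name for the box itself but give clients a stable alias, so only the alias moves at cut-over. In a standard DNS zone file that is a one-line change:
        database      IN  CNAME   database-3.example.com.
    Applications point at database.example.com permanently, and the upgrade becomes a TTL-bounded DNS change instead of a hunt for every connection string on every client.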

    Read the article

  • EF4 CTP5 Conflicting changes detected. This may happen when trying to insert multiple entities with the same key.

    - by user658332
    Hi, I am new to EF4 CTP5 and I am stuck on one problem. I am using Code First without a database, so when I run the application it generates the DB for me. Here is my scenario; I have the following class structure:
        public class KVCalculationWish
        {
            public KVCalculationWish() { }
            public int KVCalculationWishId { get; set; }
            public string KVCalculationWishName { get; set; }
            public int KVSingleOfferId { get; set; }
            public virtual KVSingleOffer SingleOffer { get; set; }
            public int KVCalculationsForPersonId { get; set; }
            public virtual KVCalculationsForPerson CaculationsForPerson { get; set; }
        }

        public class KVSingleOffer
        {
            public KVSingleOffer() { }
            public int KVSingleOfferId { get; set; }
            public string KVSingleOfferName { get; set; }
            public KVCalculationWish CalculationWish { get; set; }
        }

        public class KVCalculationsForPerson
        {
            public KVCalculationsForPerson() { }
            public int KVCalculationsForPersonId { get; set; }
            public string KVCalculationsForPersonName { get; set; }
            public KVCalculationWish CalculationWish { get; set; }
        }

        public class EntiyRelation : DbContext
        {
            public EntiyRelation() { }
            public DbSet<KVCalculationWish> CalculationWish { get; set; }
            public DbSet<KVSingleOffer> SingleOffer { get; set; }
            public DbSet<KVCalculationsForPerson> CalculationsForPerson { get; set; }

            protected override void OnModelCreating(System.Data.Entity.ModelConfiguration.ModelBuilder modelBuilder)
            {
                base.OnModelCreating(modelBuilder);
                modelBuilder.Entity<KVCalculationWish>().HasOptional(m => m.SingleOffer).WithRequired(p => p.CalculationWish);
                modelBuilder.Entity<KVCalculationWish>().HasOptional(m => m.CaculationsForPerson).WithRequired(p => p.CalculationWish);
            }
        }
    I want to use a KVCalculationWish object in both the KVCalculationsForPerson and KVSingleOffer classes, so when I create a KVCalculationsForPerson and a KVSingleOffer object I initialize both with a new KVCalculationWish object, like this:
        KVCalculationsForPerson calcPerson = new KVCalculationsForPerson();
        KVCalculationWish wish = new KVCalculationWish() { CaculationsForPerson = calcPerson };
        calcPerson.KVCalculationsForPersonName = "Person Name";
        calcPerson.CalculationWish = wish;

        KVSingleOffer singleOffer = new KVSingleOffer();
        KVCalculationWish wish1 = new KVCalculationWish() { SingleOffer = singleOffer };
        singleOffer.KVSingleOfferName = "Offer Name";
        singleOffer.CalculationWish = wish1;
    My problem is that when I save these records using the following code:
        try
        {
            db.CalculationsForPerson.Add(calcPerson);
            db.SingleOffer.Add(singleOffer);
            db.SaveChanges();
        }
        catch (Exception ex)
        {
        }
    the save succeeds, but in the KVCalculationWish table I do not get the IDs of the SingleOffer and CalculationsForPerson objects. This is the data in the KVCalculationWish table:
        KVCalcuationWishID  KVCalcuationWishName  KVSingleOfferID  KVCalcuationsForPersonID
        1                   NULL                  0                0
    This is the data in the KVSingleOffer table:
        KVSingleOfferID  KVSingleOfferName
        1                Offer Name
    This is the data in the KVCalculationsForPerson table:
        KVCalculationsForPersonID  KVCalculationsForPersonName
        1                          Person Name
    The output I would like to have in the KVCalculationWish table is:
        KVCalcuationWishID  KVCalcuationWishName  KVSingleOfferID  KVCalcuationsForPersonID
        1                   NULL                  1                NULL
        2                   NULL                  NULL             1
    So what I want to achieve is: when I save a KVSingleOffer object, a separate record should be inserted, and when I save a KVCalculationsForPerson object another separate record should be saved to the KVCalculationWish table. Is that possible? Sorry for the long description, but I am really stuck on this situation. Thanks & Regards, Joyous Suhas

    Read the article

  • Sharepoint web services -- The HTTP request is unauthorized with client authentication scheme 'Ntlm'

    - by Pandincus
    I know there are a lot of questions on SO similar to this, but I couldn't find one for this particular issue. A couple of points first:
    - I have no control over our Sharepoint server. I cannot tweak any IIS settings.
    - I believe our IIS server version is IIS 7.0.
    - Our Sharepoint server is expecting requests via NTLM.
    - Our Sharepoint server is on the same domain as my client computer.
    - I am using .NET Framework 3.5, Visual Studio 2008.
    I am trying to write a simple console app to manipulate Sharepoint data using Sharepoint Web Services. I have added the Service Reference, and the following is my app.config:
        <system.serviceModel>
          <bindings>
            <basicHttpBinding>
              <binding name="ListsSoap" closeTimeout="00:01:00" openTimeout="00:01:00"
                       receiveTimeout="00:10:00" sendTimeout="00:01:00" allowCookies="false"
                       bypassProxyOnLocal="false" hostNameComparisonMode="StrongWildcard"
                       maxBufferSize="65536" maxBufferPoolSize="524288" maxReceivedMessageSize="65536"
                       messageEncoding="Text" textEncoding="utf-8" transferMode="Buffered"
                       useDefaultWebProxy="true">
                <readerQuotas maxDepth="32" maxStringContentLength="8192" maxArrayLength="16384"
                              maxBytesPerRead="4096" maxNameTableCharCount="16384" />
                <security mode="Transport">
                  <transport clientCredentialType="Ntlm" proxyCredentialType="Ntlm" />
                </security>
              </binding>
            </basicHttpBinding>
          </bindings>
          <client>
            <endpoint address="https://subdomain.companysite.com/subsite/_vti_bin/Lists.asmx"
                      binding="basicHttpBinding" bindingConfiguration="ListsSoap"
                      contract="ServiceReference1.ListsSoap" name="ListsSoap" />
          </client>
        </system.serviceModel>
    This is my code:
        static void Main(string[] args)
        {
            using (var client = new ListsSoapClient())
            {
                client.ClientCredentials.Windows.ClientCredential = new NetworkCredential("username", "password", "domain");
                client.GetListCollection();
            }
        }
    When I call GetListCollection(), the following MessageSecurityException gets thrown: "The HTTP request is unauthorized with client authentication scheme 'Ntlm'. The authentication header received from the server was 'NTLM'." With an inner WebException: "The remote server returned an error: (401) Unauthorized." I've tried various bindings and various code tweaks to try to authenticate properly, but to no avail. I've tried the following steps:
    1. Using a native Win32 Impersonator before creating the client:
        using (new Impersonator.Impersonator("username", "password", "domain"))
        using (var client = new ListsSoapClient())
        {
            client.ClientCredentials.Windows.ClientCredential = new NetworkCredential("dpincas", "password", "domain");
            client.GetListCollection();
        }
    This produced the same error message.
    2. Setting TokenImpersonationLevel for my client credentials:
        using (var client = new ListsSoapClient())
        {
            client.ClientCredentials.Windows.AllowedImpersonationLevel = TokenImpersonationLevel.Impersonation;
            client.GetListCollection();
        }
    This produced the same error message.
    3. Using security mode=TransportCredentialOnly:
        <security mode="TransportCredentialOnly">
          <transport clientCredentialType="Ntlm" />
        </security>
    This resulted in a different error message: "The provided URI scheme 'https' is invalid; expected 'http'. Parameter name: via". However, I need to use https, so I cannot change my URI scheme. I've tried some other combinations that I can't remember, but I'll post them when I do. I'm really at wits' end here.
I see a lot of links on Google that say "switch to Kerberos", but my server seems to only be accepting NTLM, not "Negotiate" (as it would say if it was looking for Kerberos), so that is unfortunately not an option. Any help out there, folks?

    Read the article

  • C# SOCKS proxy service for HTTP requests

    - by Ed
    I'm trying to build a service that will forward HTTP requests from agents like a browser to the Tor service. Problem is, the Tor service only accepts SOCKS4a connections. So my solution is to listen for HTTP requests, get the URL they're requesting, and make a request via Tor with the help of the Starksoft.Net.Proxy library, then return the response. The library kind of works, but I'm not happy: it returns HTTP headers with the response and it can't handle images, so the responses are messed up. How could I improve my code? I'm very new to network programming. Sorry for the long example.
        public AnonymiserService(ILogger logger)
        {
            try
            {
                _logger = logger;
                _logger.Log("Listening on port {0}...", Properties.Settings.Default.ListeningPort);
                StartListener(new string[] { string.Format("http://*:{0}/", Properties.Settings.Default.ListeningPort) });
            }
            catch (Exception ex)
            {
                _logger.LogError("Exception!", ex);
            }
        }

        private void StartListener(string[] prefixes)
        {
            if (!HttpListener.IsSupported)
            {
                _logger.LogError("HttpListener isn't supported on this machine!");
                return;
            }
            HttpListener listener = new HttpListener();
            foreach (string s in prefixes)
                listener.Prefixes.Add(s);
            while (true)
            {
                listener.Start();
                IAsyncResult result = listener.BeginGetContext(new AsyncCallback(ListenerCallback), listener);
                result.AsyncWaitHandle.WaitOne();
            }
        }

        private void ListenerCallback(IAsyncResult result)
        {
            try
            {
                // Get HTTP request
                HttpListener listener = (HttpListener)result.AsyncState;
                HttpListenerContext context = listener.EndGetContext(result);
                _logger.Log("Retrieving [{0}]", context.Request.RawUrl);
                // Create connection
                // Use Tor as proxy
                IProxyClient proxyClient = new Socks4aProxyClient("localhost", 9050);
                TcpClient tcpClient = proxyClient.CreateConnection(context.Request.UserHostName, 80);
                // Create message
                // Need to set Connection: close to close the connection as soon as it's done
                byte[] data = Encoding.UTF8.GetBytes(String.Format("GET {0} HTTP/1.1\r\nHost: {1}\r\nConnection: close\r\n\r\n", context.Request.Url.PathAndQuery, context.Request.UserHostName));
                // Send message
                NetworkStream ns = tcpClient.GetStream();
                ns.Write(data, 0, data.Length);
                // Pass on HTTP response
                HttpListenerResponse responseOut = context.Response;
                if (ns.CanRead)
                {
                    byte[] buffer = new byte[32768];
                    int read = 0;
                    string responseString = string.Empty;
                    // Read response
                    while ((read = ns.Read(buffer, 0, buffer.Length)) > 0)
                    {
                        responseString += Encoding.UTF8.GetString(buffer, 0, read);
                    }
                    // Remove headers
                    if (responseString.IndexOf("HTTP/1.1 200 OK") > -1)
                        responseString = responseString.Substring(responseString.IndexOf("\r\n\r\n"));
                    // Forward response
                    byte[] byteArray = Encoding.UTF8.GetBytes(responseString);
                    responseOut.OutputStream.Write(byteArray, 0, byteArray.Length);
                }
                // Close streams
                responseOut.OutputStream.Close();
                ns.Close();
                // Close connection
                tcpClient.Close();
                _logger.Log("Retrieved [{0}]", context.Request.RawUrl);
            }
            catch (Exception ex)
            {
                _logger.LogError("Exception in ListenerCallback!", ex);
            }
        }

    Read the article

  • How to fix basicHttpBinding in WCF when using multiple proxy clients?

    - by Hemant
    [Question seems a little long, but please have patience. It has sample source to explain the problem.] Consider the following code, which is essentially a WCF host:
        [ServiceContract (Namespace = "http://www.mightycalc.com")]
        interface ICalculator
        {
            [OperationContract]
            int Add (int aNum1, int aNum2);
        }

        [ServiceBehavior (InstanceContextMode = InstanceContextMode.PerCall)]
        class Calculator: ICalculator
        {
            public int Add (int aNum1, int aNum2)
            {
                Thread.Sleep (2000); //Simulate a lengthy operation
                return aNum1 + aNum2;
            }
        }

        class Program
        {
            static void Main (string[] args)
            {
                try
                {
                    using (var serviceHost = new ServiceHost (typeof (Calculator)))
                    {
                        var httpBinding = new BasicHttpBinding (BasicHttpSecurityMode.None);
                        serviceHost.AddServiceEndpoint (typeof (ICalculator), httpBinding, "http://172.16.9.191:2221/calc");
                        serviceHost.Open ();
                        Console.WriteLine ("Service is running. ENJOY!!!");
                        Console.WriteLine ("Type 'stop' and hit enter to stop the service.");
                        Console.ReadLine ();
                        if (serviceHost.State == CommunicationState.Opened)
                            serviceHost.Close ();
                    }
                }
                catch (Exception e)
                {
                    Console.WriteLine (e);
                    Console.ReadLine ();
                }
            }
        }
    And this is the WCF client program:
        class Program
        {
            static int COUNT = 0;
            static Timer timer = null;

            static void Main (string[] args)
            {
                var threads = new Thread[10];
                for (int i = 0; i < threads.Length; i++)
                {
                    threads[i] = new Thread (Calculate);
                    threads[i].Start (null);
                }
                timer = new Timer (o => Console.WriteLine ("Count: {0}", COUNT), null, 1000, 1000);
                Console.ReadLine ();
                timer.Dispose ();
            }

            static void Calculate (object state)
            {
                var c = new CalculatorClient ("BasicHttpBinding_ICalculator");
                c.Open ();
                while (true)
                {
                    try
                    {
                        var sum = c.Add (2, 3);
                        Interlocked.Increment (ref COUNT);
                    }
                    catch (Exception ex)
                    {
                        Console.WriteLine ("Error on thread {0}: {1}", Thread.CurrentThread.Name, ex.GetType ());
                        break;
                    }
                }
                c.Close ();
            }
        }
    Basically, I am creating 10 proxy clients and then repeatedly calling the Add service method on separate threads. If I run both applications and observe the opened TCP connections using netstat, I find that:
    1. If both client and server run on the same machine, the number of TCP connections equals the number of proxy objects, meaning all requests are served in parallel. That is good.
    2. If I run the server on a separate machine, at most 2 TCP connections are opened regardless of the number of proxy objects I create. Only 2 requests run in parallel, which hurts the processing speed badly.
    3. If I switch to the net.tcp binding, everything works fine (a separate TCP connection for each proxy object, even when they run on different machines).
    I am very confused and unable to make basicHttpBinding use more TCP connections. I know it is a long question, but please help!
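    Not part of the original question, but one detail that often matters with basicHttpBinding over HTTP: outgoing HTTP connections per host are capped by ServicePointManager.DefaultConnectionLimit, which defaults to 2 for remote hosts, and the client code above never raises it. That would match the observed ceiling of two parallel requests to the separate machine. It can be raised in code before the proxies are created, or in the client's app.config:
        <system.net>
          <connectionManagement>
            <add address="*" maxconnection="100" />
          </connectionManagement>
        </system.net>
    net.tcp is unaffected because it does not go through the HTTP connection pool, which would also explain why it behaves differently.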

    Read the article

  • Use LibTiff in C# to convert from one TIFF format to another

    - by Kevin
    I have a TIFF using JPEG compression that WPF/C# cannot handle via TiffBitmapDecoder. Our clients use the file format, and our current C++ and Java code handles it. I need to convert it to a format I can display using TiffBitmapDecoder or a standard BitmapImage. It looks like the C# version of LibTiff is the way to go, but I am not having any luck converting in code. Here is my attempt - I always end up with corrupt files.
        Boolean doSystemLoad = false;
        Tiff tiff = null;
        try
        {
            tiff = Tiff.Open(file, "r");
        }
        catch (Exception e) // TIFF could not handle, let OS do it
        {
            doSystemLoad = true;
        }

        if (tiff != null)
        {
            width = Double.Parse(tiff.GetField(TiffTag.IMAGEWIDTH)[0].Value.ToString());
            height = Double.Parse(tiff.GetField(TiffTag.IMAGELENGTH)[0].Value.ToString());
            int bits = Int32.Parse(tiff.GetField(TiffTag.BITSPERSAMPLE)[0].Value.ToString());
            int samples = Int32.Parse(tiff.GetField(TiffTag.SAMPLESPERPIXEL)[0].Value.ToString());
            string compression = tiff.GetField(TiffTag.COMPRESSION)[0].Value.ToString();
            Console.WriteLine("Image is " + width + " x " + height + " bits " + bits + " sample " + samples);
            Console.WriteLine("Compression " + compression);
            // We allow OS to load anything that is not JPEG compression
            doSystemLoad = compression.ToLower().IndexOf("jpeg") == -1;
            string tempFile = Path.GetTempFileName() + ".tiff";
            // Convert here then load converted via OS
            if (!doSystemLoad)
            {
                Console.WriteLine(">> Attempting to convert... " + tempFile);
                Console.WriteLine("   Scan line " + tiff.ScanlineSize());
                Tiff tiffOut = Tiff.Open(tempFile, "w");
                tiffOut.SetField(TiffTag.IMAGEWIDTH, width);
                tiffOut.SetField(TiffTag.IMAGELENGTH, height);
                tiffOut.SetField(TiffTag.BITSPERSAMPLE, bits);
                tiffOut.SetField(TiffTag.SAMPLESPERPIXEL, samples);
                tiffOut.SetField(TiffTag.ROWSPERSTRIP, 1L);
                tiffOut.SetField(TiffTag.COMPRESSION, Compression.NONE);
                tiffOut.SetField(TiffTag.ORIENTATION, BitMiracle.LibTiff.Classic.Orientation.TOPLEFT);
                tiffOut.SetField(TiffTag.FAXMODE, FaxMode.CLASSF);
                tiffOut.SetField(TiffTag.GROUP3OPTIONS, 5);
                tiffOut.SetField(TiffTag.PHOTOMETRIC, Photometric.RGB);
                tiffOut.SetField(TiffTag.FILLORDER, FillOrder.MSB2LSB);
                tiffOut.SetField(TiffTag.PLANARCONFIG, PlanarConfig.CONTIG);
                tiffOut.SetField(TiffTag.RESOLUTIONUNIT, ResUnit.INCH);
                tiffOut.SetField(TiffTag.XRESOLUTION, 100.0);
                tiffOut.SetField(TiffTag.YRESOLUTION, 100.0);
                tiffOut.SetField(TiffTag.SUBFILETYPE, FileType.PAGE);
                tiffOut.SetField(TiffTag.PAGENUMBER, new object[] { 1, 1 });
                tiffOut.SetField(TiffTag.PAGENAME, "Page 1");

                Byte[] scanLine = new Byte[tiff.ScanlineSize() + 5000];
                for (int row = 0; row < height; row++)
                {
                    tiff.ReadScanline(scanLine, row);
                    tiffOut.WriteScanline(scanLine, row);
                }
                tiffOut.Dispose();
            }
            tiff.Dispose();

            Stream imageStreamSource = new FileStream(tempFile, FileMode.Open, FileAccess.Read, FileShare.Read);
            TiffBitmapDecoder decoder = new TiffBitmapDecoder(imageStreamSource, BitmapCreateOptions.PreservePixelFormat, BitmapCacheOption.Default);
            BitmapSource bitmapSource = decoder.Frames[0];
            width = bitmapSource.Width;
            height = bitmapSource.Height;
            imageMain.Width = width;
            imageMain.Height = height;
            imageMain.Source = bitmapSource;
        }

        if (doSystemLoad)
        {
            Stream imageStreamSource = new FileStream(file, FileMode.Open, FileAccess.Read, FileShare.Read);
            TiffBitmapDecoder decoder = new TiffBitmapDecoder(imageStreamSource, BitmapCreateOptions.PreservePixelFormat, BitmapCacheOption.Default);
            BitmapSource bitmapSource = decoder.Frames[0];
            width = bitmapSource.Width;
            height = bitmapSource.Height;
            imageMain.Width = width;
            imageMain.Height = height;
            imageMain.Source = bitmapSource;
        }

    Read the article

  • SMO ConnectionContext.StatementTimeout setting is ignored

    - by Woody
    I am successfully using PowerShell with SMO to back up most databases. However, I have several large databases on which I receive a "timeout" error: "System.Data.SqlClient.SqlException: Timeout expired". The timeout consistently occurs at 10 minutes. I have tried setting ConnectionContext.StatementTimeout to 0, to 6000, and to [System.Int32]::MaxValue; the setting made no difference. I have found a number of Google references which indicate that setting it to 0 makes it unlimited. No matter what I try, the timeouts consistently occur at 10 minutes. I even set Remote Query Timeout on the server to 0 (via Management Studio) to no avail. Below is my SMO connection where I set the timeout, and the actual backup function. Further below is the output from my script.
    UPDATE: Interestingly enough, I wrote the backup function in C# using VS 2008 and the timeout override does work in that environment. I am incorporating that C# code into my PowerShell script until I can find out why the timeout override does not work from PowerShell alone. This is extremely annoying!
        function New-SMOconnection {
            Param (
                $server,
                $ApplicationName = "PowerShell SMO",
                [int]$StatementTimeout = 0
            )
            # Write-Debug "Function: New-SMOconnection $server $connectionname $commandtimeout"
            if (test-path variable:\conn) {
                $conn.connectioncontext.disconnect()
            }
            else {
                $conn = New-Object('Microsoft.SqlServer.Management.Smo.Server') $server
            }
            $conn.connectioncontext.applicationName = $applicationName
            $conn.ConnectionContext.StatementTimeout = $StatementTimeout
            $conn.connectioncontext.Connect()
            $conn
        }

        $smo = New-SMOConnection -server $server
        if ($smo.connectioncontext.isopen -eq $false) {
            Throw "Could not connect to server $($server)."
        }

        Function Backup-Database {
            Param([string]$dbname)
            $db = $smo.Databases.get_Item($dbname)
            if (!$db) {"Database $dbname was not found"; Return}
            $sqldir = $smo.Settings.BackupDirectory + "\$($smo.name -replace ("\\", "$"))"
            $s = ($server.Split('\'))[0]
            $basedir = "\\$s\" + $($sqldir -replace (":", "$"))
            $dt = get-date -format yyyyMMdd-HHmmss
            $dbbk = new-object ('Microsoft.SqlServer.Management.Smo.Backup')
            $dbbk.Action = 'Database'
            $dbbk.BackupSetDescription = "Full backup of " + $dbname
            $dbbk.BackupSetName = $dbname + " Backup"
            $dbbk.Database = $dbname
            $dbbk.MediaDescription = "Disk"
            $target = "$basedir\$dbname\FULL"
            if (-not(Test-Path $target)) { New-Item $target -ItemType directory | Out-Null}
            $device = "$sqldir\$dbname\FULL\" + $($server -replace("\\", "$")) + "_" + $dbname + "_FULL_" + $dt + ".bak"
            $dbbk.Devices.AddDevice($device, 'File')
            $dbbk.Initialize = $True
            $dbbk.Incremental = $false
            $dbbk.LogTruncation = [Microsoft.SqlServer.Management.Smo.BackupTruncateLogType]::Truncate
            If (!$copyonly) {
                If ($kill) {$smo.KillAllProcesses($dbname)}
                $dbbk.SqlBackupAsync($server)
            }
            $dbbk
        }
    Output:
        Started SQL backups for server LCFSQLxxx\SQLxxx at 05/06/2010 15:33:16
        Statement TimeOut value set to 0.
        DatabaseName    : OperationsManagerDW
        StartBackupTime : 5/6/2010 3:33:16 PM
        EndBackupTime   : 5/6/2010 3:43:17 PM
        StartCopyTime   : 1/1/0001 12:00:00 AM
        EndCopyTime     : 1/1/0001 12:00:00 AM
        CopiedFiles     :
        Status          : Failed
        ErrorMessage    : System.Data.SqlClient.SqlException: Timeout expired. The timeout period elapsed prior to completion of the operation or the server is not responding. The backup or restore was aborted. 10 percent processed. 20 percent processed. 30 percent processed. 40 percent processed. 50 percent processed. 60 percent processed. 70 percent processed.
           at System.Data.SqlClient.SqlConnection.OnError(SqlException exception, Boolean breakConnection)
           at System.Data.SqlClient.SqlInternalConnection.OnError(SqlException exception, Boolean breakConnection)
           at System.Data.SqlClient.TdsParser.ThrowExceptionAndWarning(TdsParserStateObject stateObj)
           at System.Data.SqlClient.TdsParser.Run(RunBehavior runBehavior, SqlCommand cmdHandler, SqlDataReader dataStream, BulkCopySimpleResultSet bulkCopyHandler, TdsParserStateObject stateObj)
           at System.Data.SqlClient.SqlCommand.RunExecuteNonQueryTds(String methodName, Boolean async)
           at System.Data.SqlClient.SqlCommand.InternalExecuteNonQuery(DbAsyncResult result, String methodName, Boolean sendToPipe)
           at System.Data.SqlClient.SqlCommand.ExecuteNonQuery()
           at Microsoft.SqlServer.Management.Common.ServerConnection.ExecuteNonQuery(String sqlCommand, ExecutionTypes executionType)
        Ended backups at 05/06/2010 15:43:23

    Read the article

  • Small performance test on a web service

    - by vtortola
    Hi, I'm trying to develop a small application that tests how many requests per second my service can support, but I think I'm doing something wrong. The service is in an early development stage, but I'd like to have this test handy in order to check from time to time that I'm not doing something that decreases performance. The problem is that I cannot get the web server or the database server to 100% CPU. I'm using three different computers: one is the web server (WinSrv Standard 2008 x64, IIS7), another is the database (Win 2K - SQL Server 2005) and the last is my computer (Win7 x64 Ultimate), where I run the test. The computers are connected through a 100 Mbit Ethernet switch. The POST request is 9 bytes and the response is 842 bytes. The test launches several threads, and each thread has a "while" loop; in each iteration it creates a WebRequest object, performs a call, increments a common counter, waits between 1 and 5 milliseconds, and then does it again:
        static Int32 counter = 0;

        static void Main(string[] args)
        {
            ServicePointManager.DefaultConnectionLimit = 250;
            Console.WriteLine("Ready. Press any key...");
            Console.ReadKey();
            Console.WriteLine("Running...");
            String localhost = "localhost";
            String linuxmono = "192.168.1.74";
            String server = "192.168.1.5:8080";
            DateTime start = DateTime.Now;
            Random r = new Random(DateTime.Now.Millisecond);
            for (int i = 0; i < 50; i++)
            {
                new Thread(new ParameterizedThreadStart(Test)).Start(server);
                Thread.Sleep(r.Next(1, 3));
            }
            Thread.Sleep(2000);
            while (true)
            {
                Console.WriteLine("Request per second :" + counter / DateTime.Now.Subtract(start).TotalSeconds);
                Thread.Sleep(3000);
            }
        }

        public static void Test(Object ip)
        {
            Guid guid = Guid.NewGuid();
            Random r = new Random(DateTime.Now.Millisecond);
            while (true)
            {
                String test = "<lalala/>";
                WebRequest req = WebRequest.Create("http://" + (String)ip + "/WebApp/" + guid.ToString() + "/Data/Tables=whatever");
                req.Method = "POST";
                req.ContentType = "application/xml";
                req.Credentials = new NetworkCredential("aaa", "aaa", "domain");
                Byte[] array = Encoding.UTF8.GetBytes(test);
                req.ContentLength = array.Length;
                using (Stream reqStream = req.GetRequestStream())
                {
                    reqStream.Write(array, 0, array.Length);
                    reqStream.Close();
                }
                using (Stream responseStream = req.GetResponse().GetResponseStream())
                {
                    String response = new StreamReader(responseStream).ReadToEnd();
                    if (response.Length != 842)
                        Console.Write(" EEEE ");
                }
                Interlocked.Increment(ref counter);
                Thread.Sleep(r.Next(1, 5));
            }
        }
    When I run the test, neither of the computers shows excessive CPU usage. Let's say I get X requests per second; if I run the console application twice at the same time, I get X/2 requests per second in each one... but still the web server sits at 30% CPU and the database server at 25%. I've tried removing the Thread.Sleep in the loop, but it doesn't make a big difference. I'd like to push the machines to the maximum to check how many requests per second they can provide. I guessed that I could do it this way, but apparently I'm missing something here. What is the problem? Kind regards.

    Read the article
