Search Results

Search found 17749 results on 710 pages for 'connection pool'.


  • Mindtouch with fcgid - FastCGI Apache worker thread

    - by Stephan Kristyn
    Has anyone got Dekiwiki / Mindtouch running with mod_fcgid? I get 504 and 500 errors all the time. The Apache error log shows:

        mod_fcgid: can't apply process slot for /var/www/html/dekiwiki/index.php
        [Tue Dec 28 06:14:03 2010] [warn] (104)Connection reset by peer: mod_fcgid: read data from fastcgi server error.
        [Tue Dec 28 06:14:03 2010] [error] [client 92.75.107.53] Premature end of script headers: index.php

    I'm currently fiddling with suEXEC and FastCGI wrapper directory permissions, because I also employ a chrooted SFTP jail. The first line about the process slot sometimes no longer appears. I found a solution in German and will work through it now: http://debianforum.de/forum/viewtopic.php?f=8&t=122758&start=15
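    For context, a minimal sketch of the kind of suEXEC + mod_fcgid wrapper setup being described; the wrapper path, user/group and timeout values below are illustrative assumptions, not taken from the question:

        <VirtualHost *:80>
            ServerName wiki.example.com
            DocumentRoot /var/www/html/dekiwiki
            SuexecUserGroup wikiuser wikigroup
            <Directory /var/www/html/dekiwiki>
                Options +ExecCGI
                AddHandler fcgid-script .php
                FcgidWrapper /var/www/php-fcgi/php-wrapper .php
                Order allow,deny
                Allow from all
            </Directory>
            # generous timeouts while chasing "read data from fastcgi server error"
            FcgidIOTimeout 120
            FcgidBusyTimeout 300
        </VirtualHost>

    suEXEC is strict about ownership and permissions of the wrapper and its directory (owned by the target user, not group- or world-writable); violations typically surface as exactly this "Premature end of script headers" error, so the suexec log (its location varies by distribution) is worth checking alongside the fcgid messages.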

    Read the article

  • Install NPM Packages Automatically for Node.js on Windows Azure Web Site

    - by Shaun
    In one of my previous posts I described and demonstrated how to use NPM packages in Node.js on Windows Azure Web Site (WAWS). In that post I used the NPM command to install packages, then used Git for Windows to commit my changes and sync them to the WAWS git repository, at which point WAWS triggers a new deployment to host my Node.js application. Some may have noticed that an NPM package can contain many files and can be fairly large. For example, the "azure" package, which is the Windows Azure SDK for Node.js, is about 6 MB. Another popular package, "express", a rich MVC framework for Node.js, is about 1 MB. When I first push my code to Windows Azure, all of these files must be uploaded to the cloud. Is it possible to let Windows Azure download and install these packages for us? In this post I will show how to make WAWS install all required packages during deployment.

    Let's Start with a Demo

    A demo is the most straightforward way to show this. Create a new WAWS and clone it to local disk, then drag the folder into Git for Windows so it can handle commit and push. Please refer to my earlier post if you are not familiar with Windows Azure Web Site, Git deployment, git clone and Git for Windows. Then open a command window and install a package in the code folder; let's say I want to install "express". Next, create a new Node.js file named "server.js" and paste in the code below.

        var express = require("express");
        var app = express();

        app.get("/", function(req, res) {
            res.send("Hello Node.js and Express.");
        });

        console.log("Web application opened.");
        app.listen(process.env.PORT);

    If we switch to Git for Windows right now, we will find that it has detected the changes we made, which include "server.js" and all the files under the "node_modules" folder. Only our source code should need to be uploaded, but the large package files would be uploaded as well. Now I will show how to exclude them and let Windows Azure install the packages on the cloud side.

    First we need to add a special file named ".gitignore". This seems hard to do directly from the file explorer, since the file name consists only of an extension, so we do it from the command line. Navigate to the local repository folder and execute the command below to create an empty file named ".gitignore" (if the command window asks for input, just press Enter):

        echo > .gitignore

    Now open this file, paste in the content below and save:

        node_modules

    If we switch back to Git for Windows, we will find that the packages under "node_modules" are no longer in the change list, so when we commit and push, the "express" package will not be uploaded to Windows Azure.

    Second, let's tell Windows Azure which packages it needs to install when deploying. Create another file named "package.json", copy the content below into it and save:

        {
            "name": "npmdemo",
            "version": "1.0.0",
            "dependencies": {
                "express": "*"
            }
        }

    Now go back to Git for Windows, commit the changes and push them to WAWS. If we then open the WAWS in the developer portal, we will see that a new deployment has finished. Clicking the arrow on the right side of this deployment shows how WAWS handled it; in particular we can see that WAWS executed NPM. Opening the log shows the command WAWS executed to install the packages and the installation output messages. As you can see, WAWS installed "express" for me on the cloud side, so I didn't need to upload the whole package to Azure.
    Opening the website, we can see the result, which proves that "express" was installed successfully.

    What Happened Under the Hood

    Now let's explain a bit about what ".gitignore" and "package.json" mean. The ".gitignore" file is an ignore configuration file for a git repository: everything listed in it is skipped by git push. In the example above I put "node_modules" into this file in my local repository, which means: do not track or upload anything under the "node_modules" folder. So by using ".gitignore" I kept all packages from being uploaded to Windows Azure. A ".gitignore" can list files and folders, and it can also list files and folders that we do NOT want to ignore. In the next section we will see how to use the un-ignore syntax to include the SQL package.

    The "package.json" file is the package definition file for a Node.js application. In it we can define the application name, version, description, author and other information in JSON format, and we can also list the dependent packages this application needs. In WAWS, the name and version fields are required. When a deployment happens, WAWS looks into this file, finds the dependent packages, and executes NPM to install them one by one. In the demo above I put "express" into this file so that WAWS would install it for me automatically.

    I updated the dependencies section of "package.json" manually, but this can be done partly automatically. If we have a valid "package.json" in our local repository, we can pass the "--save" parameter to "npm install" when installing a package, and NPM will update the dependencies section for us. For example, when I wanted to install the "azure" package I would execute the command below; note the "--save" at the end:

        npm install azure --save

    Once it finishes, "package.json" is updated automatically, with each dependent package listed: the JSON key is the package name and the value is the version range. Below is a brief list of the version range formats; for more information about "package.json" please refer to the NPM documentation.

        Format      Description                                                                  Example
        version     Must match the version exactly.                                              "azure": "0.6.7"
        >=version   Must be equal to or greater than the version.                                "azure": ">=0.6.0"
        1.2.x       Must start with the supplied digits; any digit may be used in place of x.    "azure": "0.6.x"
        ~version    Must be at least the given version and less than the next major revision.    "azure": "~0.6.7"
        *           Matches any version.                                                         "azure": "*"

    WAWS will install the proper version of each package based on what is defined here. That, in short, is the flow of WAWS git deployment plus NPM installation.

    But Some Packages...

    As described, when we specify dependencies in "package.json", WAWS downloads and installs them on the cloud. For most packages this works very well, but some special packages may not: if the package installation has particular environment requirements, it can fail. For example, the SQL Server driver package for Node.js needs "node-gyp", Python and C++ 2010 installed on the target machine during NPM installation. If we just put "msnodesql" in the "package.json" file and push it to WAWS, the deployment fails because there is no "node-gyp", Python or C++ 2010 in the WAWS virtual machine. For example, here is the "server.js" file:
        var express = require("express");
        var app = express();

        app.get("/", function(req, res) {
            res.send("Hello Node.js and Express.");
        });

        var sql = require("msnodesql");
        var connectionString = "Driver={SQL Server Native Client 10.0};Server=tcp:tqy4c0isfr.database.windows.net,1433;Database=msteched2012;Uid=shaunxu@tqy4c0isfr;Pwd=P@ssw0rd123;Encrypt=yes;Connection Timeout=30;";
        app.get("/sql", function (req, res) {
            sql.open(connectionString, function (err, conn) {
                if (err) {
                    console.log(err);
                    res.send(500, "Cannot open connection.");
                }
                else {
                    conn.queryRaw("SELECT * FROM [Resource]", function (err, results) {
                        if (err) {
                            console.log(err);
                            res.send(500, "Cannot retrieve records.");
                        }
                        else {
                            res.json(results);
                        }
                    });
                }
            });
        });

        console.log("Web application opened.");
        app.listen(process.env.PORT);

    And the "package.json" file:

        {
            "name": "npmdemo",
            "version": "1.0.0",
            "dependencies": {
                "express": "*",
                "msnodesql": "*"
            }
        }

    This failed to deploy to WAWS. From the NPM log we can see it is because "msnodesql" cannot be installed on WAWS. The solution is to ignore all packages in ".gitignore" except "msnodesql", and upload that package ourselves. This can be done with the content below: we first un-ignore the "node_modules" folder, then ignore everything inside it (while still letting git look into each subfolder), and finally un-ignore the one subfolder named "msnodesql", the SQL Server driver for Node.js.

        !node_modules/

        node_modules/*
        !node_modules/msnodesql

    For more information about the ".gitignore" syntax please refer to this thread. Now, in Git for Windows, we find that "msnodesql" is included in the uncommitted set while "express" is not. I also need to remove the "msnodesql" dependency from "package.json". Commit and push to WAWS, and the deployment completes successfully. We can then use Windows Azure SQL Database from our Node.js application through the "msnodesql" package we uploaded.

    Summary

    In this post I demonstrated how to leverage the deployment process of Windows Azure Web Site to install NPM packages during publishing. With the ".gitignore" and "package.json" files we can keep the dependent packages out of our Node.js repository and let Windows Azure Web Site download and install them when deploying. For special packages that cannot be installed by Windows Azure Web Site, such as "msnodesql", we can include them in the publish payload instead. The combination of Windows Azure Web Site, Node.js and NPM makes it even easier and quicker to develop and deploy a Node.js application to the cloud.

    Hope this helps, Shaun

    All documents and related graphics and code are provided "AS IS" without warranty of any kind. Copyright © Shaun Ziyan Xu. This work is licensed under the Creative Commons License.
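    As a quick local sanity check before pushing (my own aside, not part of the original walkthrough), git can report which ignore rule applies to a given path; git check-ignore is available in git 1.8.2 and later:

        # should report the node_modules/* rule, i.e. the express package stays out of the push
        git check-ignore -v node_modules/express

        # should print nothing and exit with status 1, i.e. msnodesql is not ignored and will be committed
        git check-ignore -v node_modules/msnodesql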

    Read the article

  • Upgraded from VS 2008 -> VS 2010. Can't Connect to SQL Server in Staging Environment

    - by Bob Kaufman
    I have a test application written in C#/ASP.NET that I've developed using Visual Studio 2008 Professional/.NET 3.5 which connects to a local SQL Server 2008 Express instance. I upgraded the development machine to Visual Studio 2010 Professional maintaining .NET 3.5 and everything in the development environment continues to work correctly. Upon deployment of the new app to an internal staging machine, that app cannot connect to its local SQL Server 2008 Express database. I get the customary "server not found" error: A network-related or instance-specific error occurred while establishing a connection to SQL Server. The server was not found or was not accessible... Does something need to be upgraded on the staging machine to be able to host a Visual Studio 2010/.NET 3.5 application?

    Read the article

  • Recommend a wireless PCI card for Windows 7

    - by Dan
    I have a crummy RT2500-based 802.11g card which does work in Win7 with the Vista driver (3.2.0.0), but it dies about every two hours or so. Googling around has led me to conclude that Ralink drivers are basically borked and that I need something else for a stable connection. Can anyone recommend a suitable wireless adapter? It needs to be:

    - 802.11g - draft-N nice but not at all essential.
    - PCI - I already have far more USB devices than can possibly be good for me.
    - Very reliable. Money isn't an object, within reason.

    All input gratefully received!

    Read the article

  • How can I get the Creative Zen Touch to work?

    - by Hello71
    I've tried using gnomad2 (too lazy to configure all the dependencies) and Amarok (segfaulted when I tried to do anything with it).

        $ lsusb
        Bus 002 Device 004: ID 041e:4131 Creative Technology, Ltd Zen Touch (mtp)
        # SNIP OUTPUT #

        $ lsusb --verbose -s 002:004
        Bus 002 Device 004: ID 041e:4131 Creative Technology, Ltd Zen Touch (mtp)
        Device Descriptor:
          bLength                18
          bDescriptorType         1
          bcdUSB               2.00
          bDeviceClass          255 Vendor Specific Class
          bDeviceSubClass         0
          bDeviceProtocol         0
          bMaxPacketSize0        64
          idVendor           0x041e Creative Technology, Ltd
          idProduct          0x4131 Zen Touch (mtp)
          bcdDevice            1.00
          iManufacturer           1 Creative Technology Ltd
          iProduct                2 Creative Zen Touch
          iSerial                 3 010125517D039098
          bNumConfigurations      1
          Configuration Descriptor:
            bLength                 9
            bDescriptorType         2
            wTotalLength           39
            bNumInterfaces          1
            bConfigurationValue     1
            iConfiguration         16 Configuration 1
            bmAttributes         0xc0 Self Powered
            MaxPower            440mA
            Interface Descriptor:
              bLength                 9
              bDescriptorType         4
              bInterfaceNumber        0
              bAlternateSetting       0
              bNumEndpoints           3
              bInterfaceClass         0 (Defined at Interface level)
              bInterfaceSubClass      0
              bInterfaceProtocol      0
              iInterface             33 MTP Interface
              Endpoint Descriptor:
                bLength                 7
                bDescriptorType         5
                bEndpointAddress     0x01 EP 1 OUT
                bmAttributes            2 Transfer Type Bulk, Synch Type None, Usage Type Data
                wMaxPacketSize     0x0200 1x 512 bytes
                bInterval               0
              Endpoint Descriptor:
                bLength                 7
                bDescriptorType         5
                bEndpointAddress     0x82 EP 2 IN
                bmAttributes            2 Transfer Type Bulk, Synch Type None, Usage Type Data
                wMaxPacketSize     0x0200 1x 512 bytes
                bInterval               0
              Endpoint Descriptor:
                bLength                 7
                bDescriptorType         5
                bEndpointAddress     0x83 EP 3 IN
                bmAttributes            3 Transfer Type Interrupt, Synch Type None, Usage Type Data
                wMaxPacketSize     0x0008 1x 8 bytes
                bInterval               4
        Device Qualifier (for other device speed):
          bLength                10
          bDescriptorType         6
          bcdUSB               2.00
          bDeviceClass          255 Vendor Specific Class
          bDeviceSubClass         0
          bDeviceProtocol         0
          bMaxPacketSize0        64
          bNumConfigurations      1
        can't get debug descriptor: Connection timed out
        Device Status:     0x0000 (Bus Powered)

    Read the article

  • Unable to modify phpMyAdmin variables via the Variables tab (XAMPP)

    - by rookie coder
    I am quite new to phpMyAdmin configuration. I have a project that needs utf8 encoding, so I am trying to change all the text/character-set variables to utf8. I changed them, and at that moment the values did change to what I wanted. But when I shut down XAMPP and re-enter the phpMyAdmin page, or even just refresh the page, all the values are restored to their defaults (the original values). My phpMyAdmin uses the default root user with no password set yet, and there is no logout button on the phpMyAdmin landing page. I also had a hard time setting the server connection collation (it hangs indefinitely and never seems to get updated). phpMyAdmin version: 4.1.6, MySQL: 5.5.36 (latest version). I doubt this is due to a malformed installation, because the same thing happens on my other computer too (exactly the same versions). What could be wrong? Thanks.

    Read the article

  • What is the command to check whether a command's output mentions OK?

    - by Manuel
    Alright, so I was playing around with changing the MTU size and wanted to make a batch file to automatically lower it and then raise it later. This is probably simple, but I just can't figure it out. The point is: is there a way to run a command that would normally echo "OK", and check whether it actually says OK? And if it doesn't say OK, stop the rest of the file from running and exit. The command I'm using is:

        netsh interface ipv4 set subinterface "Local Area Connection" mtu=386 store=persistent

    which, as mentioned above, prints out an OK. I just want to check that it ran correctly, and if not, then do __
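    A minimal batch sketch of one way to do this check, assuming the netsh command really does print "Ok." on success (worth verifying on the machine in question); findstr sets ERRORLEVEL to 1 when the text is not found:

        @echo off
        rem Run the command and scan its output for "ok" (case-insensitive).
        netsh interface ipv4 set subinterface "Local Area Connection" mtu=386 store=persistent | findstr /i "ok" >nul
        if errorlevel 1 (
            echo netsh did not report OK - aborting.
            exit /b 1
        )
        echo MTU lowered successfully.
        rem ...rest of the batch file continues here...

    If netsh's own exit code turns out to be dependable on this system, putting "if errorlevel 1" directly after the netsh call (without the findstr pipe) would be simpler.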

    Read the article

  • Clean URLs issue using .htaccess in PHP project

    - by x4ph4r
    I am working on a PHP Laravel project and am currently facing issues with the .htaccess file. I have the following .htaccess:

        <IfModule mod_rewrite.c>
            Options -MultiViews
            RewriteEngine On
            RewriteBase /
            RewriteCond %{REQUEST_FILENAME} !-f
            RewriteRule ^ index.php [L]
        </IfModule>

    When I reload my page it gives me the following error:

        404 Not Found
        The requested URL /contacts was not found on this server.

    I then opened the /etc/apache2/users/username.conf file, which had the following lines:

        <Directory "/Users/username/Sites/">
            Options Indexes MultiViews
            AllowOverride None
            Order allow,deny
            Allow from all
        </Directory>

    In the code above I changed AllowOverride None to AllowOverride All. Then I reloaded the page and got this error:

        403 Forbidden
        You don't have permission to access /contacts on this server.

    When I add FollowSymLinks to the .htaccess Options, like this: Options -MultiViews FollowSymLinks, then sometimes I get 500 Internal Server Error and sometimes *Error 324 (net::ERR_EMPTY_RESPONSE): The server closed the connection without sending any data*. Each time I reload the page I get one of these errors with the FollowSymLinks option. I also uncommented the following lines in /etc/apache2/httpd.conf:

        LoadModule rewrite_module libexec/apache2/mod_rewrite.so
        LoadModule php5_module libexec/apache2/libphp5.so

    and I am still getting the same permission denied error. Please help me; I have been trying to solve this problem for the past 3 days but it is still unresolved.
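    For what it's worth, a sketch of the combination that usually works for a Laravel rewrite setup on Apache 2.2, with the paths copied from the question. One detail worth noting: Apache rejects an Options line that mixes prefixed and unprefixed values, so "Options -MultiViews FollowSymLinks" is itself a syntax error and can by itself produce a 500; either prefix both values or prefix neither.

        # .htaccess
        <IfModule mod_rewrite.c>
            Options -MultiViews +FollowSymLinks
            RewriteEngine On
            RewriteBase /
            RewriteCond %{REQUEST_FILENAME} !-f
            RewriteRule ^ index.php [L]
        </IfModule>

        # /etc/apache2/users/username.conf (Apache 2.2 syntax, as in the question)
        <Directory "/Users/username/Sites/">
            Options Indexes FollowSymLinks MultiViews
            AllowOverride All
            Order allow,deny
            Allow from all
        </Directory>

    FollowSymLinks (or SymLinksIfOwnerMatch) must be enabled on the directory for mod_rewrite to work from a .htaccess file, which is why it appears in both places here.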

    Read the article

  • DIR 601 No wireless internet

    - by ashley
    I have an orange globe light on my D-Link router. I signed into 192.168.0.1 and went to Manual Internet Connection Setup, as I was told to do. When I clicked on that and tried to clone my PC's MAC address, it said "invalid MAC address". The settings page shows:

        Host Name: DIR-601
        Use Unicasting: (compatibility for some DHCP Servers)
        Primary DNS Address: 0.0.0.0
        Secondary DNS Address: 0.0.0.0
        MTU: (bytes) MTU default = 1500
        MAC Address: F8:1E:DF:EA:38:E6

    How do I get a valid MAC address so I can save the settings and move on to the next steps I was told to do in order to get wireless internet again?

    Read the article

  • Java HttpURLConnection class Program

    - by pandu
    I am learning Java. Here is the sample code for HttpURLConnection usage from a textbook:

        import java.net.*;
        import java.io.*;
        import java.util.*;

        class HttpURLDemo {
            public static void main(String args[]) throws Exception {
                URL hp = new URL("http://www.google.com");
                HttpURLConnection hpCon = (HttpURLConnection) hp.openConnection();

                // Display request method.
                System.out.println("Request method is " + hpCon.getRequestMethod());

                // Display response code.
                System.out.println("Response code is " + hpCon.getResponseCode());

                // Display response message.
                System.out.println("Response Message is " + hpCon.getResponseMessage());

                // Get a list of the header fields and a set of the header keys.
                Map<String, List<String>> hdrMap = hpCon.getHeaderFields();
                Set<String> hdrField = hdrMap.keySet();

                System.out.println("\nHere is the header:");

                // Display all header keys and values.
                for (String k : hdrField) {
                    System.out.println("Key: " + k + " Value: " + hdrMap.get(k));
                }
            }
        }

    My question is: why is the hpCon object declared like this,

        HttpURLConnection hpCon = (HttpURLConnection) hp.openConnection();

    instead of like this?

        HttpURLConnection hpCon = new HttpURLConnection();

    The author provided the following explanation, which I can't understand: "Java provides a subclass of URLConnection that provides support for HTTP connections. This class is called HttpURLConnection. You obtain an HttpURLConnection in the same way just shown, by calling openConnection() on a URL object, but you must cast the result to HttpURLConnection. (Of course, you must make sure that you are actually opening an HTTP connection.) Once you have obtained a reference to an HttpURLConnection object, you can use any of the methods inherited from URLConnection."
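    To illustrate the author's point (my own sketch, not from the textbook): openConnection() is declared to return the abstract type URLConnection, and HttpURLConnection is itself abstract, so it cannot be created with new. For an http:// URL the object actually returned is an HttpURLConnection subclass, which is why the cast is needed, and an instanceof check makes the downcast safe:

        import java.net.HttpURLConnection;
        import java.net.URL;
        import java.net.URLConnection;

        class CastDemo {
            public static void main(String[] args) throws Exception {
                URL url = new URL("http://www.google.com");

                // Static type is URLConnection; the runtime type depends on the protocol.
                URLConnection conn = url.openConnection();

                if (conn instanceof HttpURLConnection) {
                    // Safe downcast: only now are HTTP-specific methods available.
                    HttpURLConnection http = (HttpURLConnection) conn;
                    System.out.println("Response code: " + http.getResponseCode());
                    http.disconnect();
                } else {
                    System.out.println("Not an HTTP connection: " + conn.getClass());
                }
            }
        }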

    Read the article

  • CodePlex Daily Summary for Monday, October 07, 2013

    Popular Releases

    Ghostscript.NET: Ghostscript.NET v.1.1.0.: v.1.1.0. added GhostscriptViewer state handling (SaveState, RestoreState); the GhostscriptRasterizer constructor is extended to support usage of an existing GhostscriptViewer instance; fixed a problem when using a 32-bit assembly with the 32-bit version of Ghostscript on 64-bit Windows: it couldn't find the registry key of the installed Ghostscript. Reported and fixed by "r0land". v.1.0.9. implemented EPS (Encapsulated PostScript) support for the GhostscriptViewer; added GhostscriptRasterize...

    Ghostscript Studio: Ghostscript.Studio v.1.0.2: Ghostscript Studio is an easy-to-use Ghostscript IDE, a tool that facilitates the use of the Ghostscript interpreter by providing a graphical interface for PostScript editing and file conversions. Ghostscript Studio allows you to preview PostScript files, edit the code and execute them in order to convert PDF documents and other formats. The program allows you to convert between PDF, PostScript, EPS, TIFF, JPG and PNG using the Ghostscript.NET Processor. v.1.0.2. added custom -c s...

    cmdradio: v0.1.1 binary: Default download is win32; for other OSes see here. This is an alpha version; please report all bugs.

    Media Companion: Media Companion MC3.579b: Fixed IMDB actor scraping for Movies and TV. Note: there are a couple of new functions that are not active, as this release needed to be done due to the IMDB change. New: * TV - context menu for rescraping Poster image/Banner image. Fixed: * Both - fixed actor scraping from IMDB. * Movie - fixed Tableview if a movie's movieset was not in MC's list of moviesets. * Movie - rename with mediainfo now lowercase. * Both - added ignore "A " in titles, separate option in General Preferences. * Tv - Changing ...

    VidCoder: 1.5.7 Beta: Updated HandBrake core to SVN 4819. The About dialog now pulls the HandBrake version from the DLL. Added a confirmation dialog to Stop if the encode has been going on for more than 5 minutes. Fixed handling of Unicode characters for input and output filenames; we now encode to UTF-8 before passing to HandBrake. Fixed a crash in the queue multiple titles dialog. Added code to rescue tool windows which get placed outside the visible screen area.

    Wsus Package Publisher: Release v1.3.1310.05: Enhanced "Reboot Remote Computers" by adding a timer before the reboot occurs, so that remote users can save their documents and close applications; you can also add a message to be displayed (in 'Tools'->'Settings'->Misc tab you can set a default message). Enhanced "Compare Computers against AD" by choosing OUs to include in the comparison.

    Event-Based Components AppBuilder: AB3.Iteration.53: Iteration 53 (Feature): allow drag&drop of an existing component (flow, step) from the component list to the chart. Duplicate names are automatically recognized and resolved. The color of the dragged component shows what kind of component (flow or step) is currently being dragged. New: AddExistingComponentFlow, PartDragDropEventHandler, ExistingStepPreparer

    Pulse: Pulse 0.6.7.3: Pulse is now accepting donations; to donate by Bitcoin or PayPal see https://pulse.codeplex.com/wikipage?title=Donations. Lots of updates in v0.6.7.3: (Feature) A new option allows you to disable wallpaper changing when a full-screen application is running, so Pulse doesn't slow down or lag your videos and games. (Fix) Some users were getting Wallbase errors when logging in; this has been fixed. (Feature) Right-click a provider and you can now make a copy of it by selecting the "Dupl...

    MoreTerra (Terraria World Viewer): MoreTerra 1.11.1: Release 1.11.1. Bug fixes: added more tile blocks (Clouds, crimstone); added items (binoculars, rope, Piranha Gun); added ores (Lead, Tin); chests now work again ("I broke them yesterday"). Known issues: "I am having trouble with new background walls, so you will see a red outline for crimson then a pink inside. Same with where I think the queen bee lives."

    VG-Ripper & PG-Ripper: PG-Ripper 1.4.19: NEW: added option to log in as Guest. NEW: added menu option to delete a forum account. NEW: added support for ImageTeam.org links. FIXED: fixed ripping of http://forum.babeunion.com forums.

    Single Line Diagram (Electrical) Control Library: Single Line Diagram (Electrical) Control Lib - 1.2: In this release of the SLD (Electrical) Control Lib the code is fixed for the following symbols: 1. Switch 2. Isolator 3. Lightning Arrestor 4. Meter. A new symbol is added for "Pothead Terminal". You can give it a try; I will keep improving and posting new features.

    State of Decay Save Manager: Version 1.0.4: Add version at bottom of form.

    Collection Commander for Configuration Manager 2012: CMCollCtr_1.0.0.2: Windows Installer (MSI) setup; new Ribbon toolbar implemented; more PowerShell commands...

    Classic WiX Burn Theme: Classic WiX Burn Theme 1.0: A WiX Burn theme inspired by the classic WiX wizard user interface.

    QuickConverter: QuickConverter 0.7.3 Beta: Allows access to external variables from inside lambda expressions.

    DNN® Form and List: DNN Form and List 06.00.07: Changes to 6.0.7: fixed an error in datatypes.config that caused calculated fields to be missing in 6.0.6. Changes to 6.0.6: added SQL to remove the 'text on row' setting for UserDefinedTable to make it SQL Azure compatible; added a new azureCompatible element to the manifest; added a fix for importing templates. Changes to 6.0.2: Fix: MakeThumbnail was broken if the application pool was configured to .NET 4. Change: data is now stored in nvarchar(max) instead of ntext C...

    Trace Reader for Microsoft Dynamics CRM: Trace Reader (1.2013.10.3): Fixed a bug when the first character of a description line is '['. Added a search feature.

    SimpleExcelReportMaker: Serm 0.03: Source code and sample. .Net Framework 3.5, AnyCPU compile.

    Application Architecture Guidelines: App Architecture Guidelines 3.0.8: This document is an overview of software qualities, principles, patterns, practices, tools and libraries.

    BlackJumboDog: Ver5.9.6: 2013.09.30 Ver5.9.6 (1)SMTP???????、???????????????? (2)WinAPI??????? (3)Web???????CGI???????????????????????

    New Projects

    ASP.NET and ASP.NET MVC Blog Test: As a recent convert from Windows Forms & WPF to ASP.NET and ASP.NET MVC, I've created a simple project and implemented it twice using ASP.NET and ASP.NET MVC.

    cmdradio: Simple command-line interface internet radio player.

    Da Nang College Of Technology: Cao Ð?ng Công Ngh?

    DNN Camera Slideshow: The Camera Slideshow module for DNN.

    Folder Iterator: Folder Iterator is a program I made one night to iterate through a folder and output an ordered list of files contained within the chosen directory.

    foobarpoc: foo

    Jamendo Search Rest: This project uses the Jamendo REST API to search songs by title.

    Lab Nhibernate, PRISM, WPF, FLUENT, C#, ServiceStack, JSON: This is a lab for new technology and solving real developer problems.

    Merch ASP.NET: This project is a port of Merch built on Zend Framework 2 / PHP 5.

    MyTest2: This is test project 2.

    Paintbear: Paintbear is a free open-source paint program and image manipulation program for Windows based on the .NET Framework (version 4). Works on Linux, too, with Wine.

    RasterConversion (.Net framework): Comparison of variants for working with a raster (Bitmap class) at a low level. .Net Framework, C#.

    SSIS Custom Connection Manager: Example code for creating a Custom Connection Manager in SSIS 2008 and 2012.

    Web Scripting Project: This is a project related to web application development at the University of Hertfordshire.

    Webscripting1: New scripting website for University of Herts course.

    Write.NET: Another .NET Blogging Engine: Write.NET is currently under development. Write.NET will be a lightweight and highly configurable blogging engine built in MVC4.

    Yoer: ????ASP.NET MVC??,????????????

    Read the article

  • Remote desktop use two out of four monitors

    - by William Gant
    I've recently upgraded my home workstation and now have four monitors on it. I work remotely most of the time and need some way to get Remote Desktop onto only two of those four monitors. The top two monitors (monitors 4 & 3, going from left to right) each have a maximum resolution of 1680x1050. The bottom two monitors (1 & 2) each have a maximum resolution of 1920x1080. In my .rdp file for this Remote Desktop connection, I have the following keys (clipped for brevity):

        screen mode id:i:2
        use multimon:i:1
        desktopwidth:i:1920
        desktopheight:i:2130
        session bpp:i:32
        winposstr:s:0,1,3,75,1655,675

    Previously I was able to get away with just doing "mstsc /span" when I had only two monitors, but that isn't working now (and isn't desirable). I'd like the new setup to use only two of my monitors; I don't really care which two. How do I alter the .rdp file to accomplish this?
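    As an aside (not from the question, and only honored by newer Remote Desktop clients): recent versions of the client document a selectedmonitors .rdp setting that, together with use multimon, restricts the session to specific local displays. The monitor IDs below are placeholders; "mstsc /l" lists the actual IDs on a given machine.

        use multimon:i:1
        selectedmonitors:s:0,1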

    Read the article

  • Cyrus: How Do I Configure saslauthd For Authentication?

    - by Nick
    I'm trying to get Cyrus IMAP (v2.2 on Ubuntu 9.04) set up and working, but I'm having a bit of trouble getting the login working correctly. I've created a mailbox for my test user "nrahl":

        cm user/nrahl

    and then created a password:

        $ saslpasswd2 nrahl

    I'm attempting to connect to the mailbox using Thunderbird, with the machine's LAN IP address as the host and "nrahl" as the username. It connects to the server and prompts me for the password. When I enter it, I get "Login to server failed." in Thunderbird, and /var/log/mail.log shows:

        Apr 15 19:20:01 IMAP cyrus/imap[1930]: accepted connection
        Apr 15 19:20:09 IMAP cyrus/imap[1930]: badlogin: [192.168.5.21] plaintext nrahl SASL(-13): authentication failure: checkpass failed

    Part of /etc/imapd.conf, with comments removed:

        sieveusehomedir: false
        sievedir: /var/spool/sieve
        #mailnotifier: zephyr
        #sievenotifier: zephyr
        #dracinterval: 0
        #drachost: localhost
        hashimapspool: true
        allowplaintext: yes
        sasl_mech_list: PLAIN
        #allowapop: no
        #sasl_maximum_layer: 256
        #loginrealms: example.com
        #virtdomains: userid
        #defaultdomain:
        sasl_pwcheck_method: saslauthd
        #sasl_auxprop_plugin: sasldb
        sasl_auto_transition: no

    UPDATE: When setting sasl_pwcheck_method: alwaystrue in /etc/imapd.conf, login works correctly. So I'm assuming the issue is saslauthd related.
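    Two checks that often narrow this down (my suggestions, not from the question, assuming Debian/Ubuntu defaults): saslpasswd2 writes to /etc/sasldb2, which saslauthd only consults when its mechanism is set to sasldb (MECHANISMS in /etc/default/saslauthd; the shipped default is pam, i.e. system accounts), and the cyrus user needs access to the saslauthd socket, which usually means membership in the sasl group.

        # does saslauthd itself accept the credential?
        testsaslauthd -u nrahl -p 'the-password'

        # let Cyrus reach /var/run/saslauthd/mux, then restart it
        sudo adduser cyrus sasl
        sudo /etc/init.d/cyrus2.2 restart   # init script name may differ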

    Read the article

  • Wifi range issues and intermittent dropouts, Thinkpad Edge

    - by jimbo
    If I am more than a couple of metres from my access point (and I'm seeing this across various APs) with my newish Thinkpad Edge 15, running 10.10, the wifi performance becomes ... flaky. When this is happening, I see the following in dmesg, although I'm not sure if it's related:

        [ 2497.011099] intel ips 0000:00:1f.6: CPU power or thermal limit exceeded
        [ 2502.012711] intel ips 0000:00:1f.6: CPU power or thermal limit exceeded
        [ 2507.009254] intel ips 0000:00:1f.6: CPU power or thermal limit exceeded
        [ 2512.008367] intel ips 0000:00:1f.6: CPU power or thermal limit exceeded
        [ 2517.007467] intel ips 0000:00:1f.6: CPU power or thermal limit exceeded
        [ 2522.006558] intel ips 0000:00:1f.6: CPU power or thermal limit exceeded
        [ 2527.008157] intel ips 0000:00:1f.6: CPU power or thermal limit exceeded
        [ 2532.007251] intel ips 0000:00:1f.6: CPU power or thermal limit exceeded
        [ 2537.003838] intel ips 0000:00:1f.6: CPU power or thermal limit exceeded
        [ 2542.005427] intel ips 0000:00:1f.6: CPU power or thermal limit exceeded
        [ 2547.004496] intel ips 0000:00:1f.6: CPU power or thermal limit exceeded
        [ 2552.003611] intel ips 0000:00:1f.6: CPU power or thermal limit exceeded

    lspci -vvv has the following to say about my wireless adapter:

        03:00.0 Network controller: Intel Corporation Centrino Wireless-N 1000
            Subsystem: Intel Corporation Centrino Wireless-N 1000 BGN
            Control: I/O- Mem+ BusMaster+ SpecCycle- MemWINV- VGASnoop- ParErr- Stepping- SERR+ FastB2B- DisINTx-
            Status: Cap+ 66MHz- UDF- FastB2B- ParErr- DEVSEL=fast >TAbort- <TAbort- <MAbort- >SERR- <PERR- INTx-
            Latency: 0, Cache Line Size: 64 bytes
            Interrupt: pin A routed to IRQ 49
            Region 0: Memory at f0500000 (64-bit, non-prefetchable) [size=8K]
            Capabilities: <access denied>
            Kernel driver in use: iwlagn
            Kernel modules: iwlagn

    If I get within a couple of metres of the access point, I still see that output in dmesg, but the connection stabilises. My question is threefold: how do I get better wifi range, what can/should I do about those messages in dmesg, and most crucially, are the two related? As ever, let me know if there's other information that would help! Edit: I am using this machine in exactly the same locations I used my previous Thinkpad (T61) running various older versions of Ubuntu, so I definitely feel there is something wrong, rather than me having unreasonable expectations of range!

    Read the article

  • Glusterfs : 'No route to host' for fstab mount in CentOS

    - by son_of_fire
    I am using glusterfs, with the following fstab entry:

        <IPADDRESS>:/<VOLUMENAME> /some/mount/point glusterfs defaults,_netdev 0 0

    but the logs for the mount keep saying:

        [<TIMESTAMP>] E [socket.c:2161:socket_connect_finish] 0-<VOLUMENAME>-client-1: connection to <IPADDRESS>:24007 failed (No route to host)

    I know this is not true, since once the system is up and running I can easily issue a mount and the volume gets mounted (I've done this using rc.local). After reading more I have seen that using _netdev is preferred, and that if the host cannot be reached netfs should remount the volume after the network comes up, but that is not happening (netfs is running). Is there a way to make the mount happen at a different time without using a script? (I would prefer to use fstab to manage the mounting even though I can use a script.)

    Read the article

  • Routing only some local IPs through VPN on dd-wrt

    - by bo-inge-ostberg
    Much like this entry: http://serverfault.com/questions/94283/using-dd-wrt-to-connect-to-vpn-and-forward-all-traffic-of-certain-devices-through , I have set up my router with dd-wrt + OpenVPN to connect to a VPN. This works fine, and all traffic from behind the router goes through the VPN. How do I route traffic on the router so that only certain IPs from the LAN go through the VPN, while the others take the "normal" route? Is it also possible to allow traffic from certain local IPs to go ONLY through the VPN, making it impossible for them to use the regular internet connection if the VPN is down? I know this question was answered in the post I linked to, but that just doesn't seem to work for me: the routing table and rules change, but traffic still just goes through the VPN.
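    For reference, a bare-bones policy-routing sketch of the idea being asked about (generic iproute2 commands; the 192.168.1.50 address, the table number and the tun1 interface name are assumptions and would need to match the actual dd-wrt setup):

        # send one LAN host's traffic through the VPN tunnel only
        ip route add default dev tun1 table 100
        ip rule add from 192.168.1.50 table 100
        ip route flush cache

        # optional "kill switch": if the tunnel is down, drop rather than leak out the WAN
        iptables -I FORWARD -s 192.168.1.50 -o $(nvram get wan_ifname) -j DROP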

    Read the article

  • Creating a proper VPN tunnel when both LANs have the same addressing

    - by meta
    I was following this tutorial, http://wiki.debian.org/OpenVPN#TLS-enabled_VPN , and this one, http://users.telenet.be/mydotcom/howto/linux/openvpn.htm , to create an OpenVPN connection to my remote LAN. But both examples assume that the two LANs have different address ranges (i.e. 192.168.10.0/24 and 192.168.20.0/24; see this image: i.stack.imgur.com/2eUSm.png). Unfortunately, in my case both the local and the remote LAN use 192.168.1.0/24. I am able to connect directly to the OpenVPN server (I can ping it and log in with SSH), but I can't see other devices on the remote LAN (never mind accessing them via a browser, which was the point in the first place). Could the addressing overlap be the reason for that? If not, how do I define routes so I can ping the other devices on the remote LAN?

    Read the article

  • Load balance incoming traffic

    - by justin
    Dear all, I have the following scenario:

    - 3 servers: VoIP / mail / terminal
    - one load-balancing router
    - 2 internet connections (static IPs)

    My concern is load balancing the incoming traffic, since the outgoing traffic is already taken care of by the load-balancing router. All offices connect to the mail server via the internet, and the same goes for VoIP and terminal services. The mail and VoIP clients are set up with one of the static IPs, and the router forwards the requests to the appropriate server. But obviously, like this there is no failover and no load balancing, because all requests are directed to one internet connection. Does anyone have a suggestion? I was thinking of a DNS server - does that make sense? Or maybe a hosted option? Thanks, Justin

    Read the article

  • CentOS iptables blocks after reboot

    - by bilal
    I have a CentOS 5 installation with KVM on my server. I am using NAT port forwarding to SSH into my virtual machines. I have several iptables rules and saved them in /etc/sysconfig/iptables. After a reboot I see all these rules when I type "service iptables status", but I get a connection refused error. After typing "service iptables restart" everything works. I don't understand: why do I need to restart iptables again? Doesn't it start on reboot?
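    A few things worth comparing (my suggestions, not from the question). One common cause of "rules listed but not effective until a restart" on a KVM host is ordering: libvirtd inserts its own NAT/FORWARD rules when it starts, after the saved rules have been loaded, and a later "service iptables restart" changes which set wins. The basic CentOS 5 persistence commands for checking the setup are:

        # make sure the service is enabled for the current runlevel
        chkconfig --list iptables
        chkconfig iptables on

        # after getting the rules right at runtime, save them so they survive a reboot
        service iptables save

        # compare what is loaded right after boot vs. after a manual restart
        iptables -L -n -v --line-numbers
        iptables -t nat -L -n -v --line-numbers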

    Read the article

  • Microsoft Word Document Recovery seems unresolvable

    - by LarsTech
    We work on a server (Windows Server 2003 Enterprise) that a bunch of us developers remote into using Remote Desktop Connection, and now every "other" time I open up Word I get this Document Recovery pane: As you can see, "Delete" and "Show Repairs" are disabled (I don't think this is my document). "Open" or "Save As" works, but it never marks this document as recovered; it just keeps coming up every other time I open Word. I don't even care about this document. How can I remove it from the list so that it doesn't come up anymore? Update: As per Moab's comment, I had the user delete the file. But the Document Recovery pane still shows up with that item every other time Word is opened; the only difference now is that clicking "Open" just produces the obvious "file not found" style message. How can I clear this list? BTW, the user that owned this document never gets the Document Recovery pane.

    Read the article

  • Internet not working on a VirtualBox Linux machine

    - by mtahmed
    I installed VirtualBox on a Windows 7 PC and installed Xubuntu (Ubuntu with XFCE) in it. On the Windows side I am connected to the internet through a LAN which automatically authenticates to the network (because it's my work computer). I have also installed the Guest Additions on this virtual machine (Xubuntu). I tried both NAT and bridging to set up the internet connection, but I still cannot connect to the internet. For NAT: the browser doesn't say the website was not found, but it keeps trying to load it forever, and the network status monitor (the one provided by VirtualBox) doesn't show any green light (no incoming packets). For bridging: same as for NAT, but the network status monitor shows incoming and outgoing packets; the internet still doesn't work. I have tried a lot of different things but none worked.

    Read the article

  • Understanding mongod log entries

    - by Jo Erlang
    Can anyone explain the numbers in the % column in this mongod log snippet? It is building a new index, but I'm not sure I understand why the numbers look like that.

        Tue Nov 20 11:38:08 [initandlisten] connection accepted from 127.0.0.1:49299 #8 (2 connections now open)
        Tue Nov 20 11:38:08 [conn8] build index xx.yyyy { source: 1 }
        Tue Nov 20 11:38:19 [conn8] 2921300/243339 1200%
        Tue Nov 20 11:38:29 [conn8] 4109600/243339 1688%
        Tue Nov 20 11:38:39 [conn8] 4400100/243339 1808%
        Tue Nov 20 11:38:49 [conn8] 4676600/243339 1921%
        Tue Nov 20 11:38:59 [conn8] 4939700/243339 2029%
        Tue Nov 20 11:39:09 [conn8] 5217800/243339 2144%
        Tue Nov 20 11:39:19 [conn8] 5439300/243339 2235%
        Tue Nov 20 11:39:29 [conn8] 5659700/243339 2325%

    Thanks

    Read the article

  • WebDAV mapped drive asking for username and password

    - by confus3d
    Since we migrated domains we've been having problems mapping a drive over a WebDAV connection in our login script. It's a simple:

        net use x: \\server.domain.com\folder

    which used to authenticate automatically (all we needed to do to make that happen was put the server in the intranet zone in the Internet Explorer settings). Since the domain migration, though, nearly everyone is being prompted for a username and password to connect. Does anyone have any idea how to fix this? Any help much appreciated. The WebDAV share is on a Windows 2003 server running IIS.

    Read the article

  • How to contact an emergency service using only Internet?

    - by Vi.
    Suppose you are away from any mobile or landline phone (so you can't call 112 or 911 or 999), but you have access to a computer with an internet connection. How do you call or message an emergency service (of whatever country, in the hope they will route the request to the correct destination) using only the Internet? Maybe there's some 911-like website, or a public SIP service, or something similar? Or is it better to go to some chat/forum/StackExchange and ask somebody to make a call for you? (Would users really believe it?) /* 1. I'm not in any emergency, just curious. 2. I'm not sure on which SE site to ask this question. */

    Read the article

  • FTP an entire folder via the command line

    - by Chris Southam
    I'm looking to migrate some websites to a new server. I have SSH access to the current one but only FTP access to the new one. How can I, from CentOS over SSH, copy entire folders to the new server via FTP? I can log into the new server with the built-in FTP client on CentOS, but I'm not sure of the commands after that. I need to move the httpdocs folder from the old server into the public_html folder on the new one. I'd love to do this server to server, as it will be a lot quicker than downloading and re-uploading over my broadband connection. Yours, Chris
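    One way to do a recursive server-to-server upload from the old box is lftp's reverse mirror; a sketch, with the host, credentials and paths as placeholders:

        # run on the old server; install with "yum install lftp" if needed
        lftp -u ftpuser ftp://new.example.com \
             -e "mirror -R --parallel=4 /path/to/httpdocs /public_html; quit"

    mirror -R uploads the local tree to the remote side, preserving the directory structure, which avoids the download-then-upload round trip through a local connection.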

    Read the article
