Search Results

Search found 40998 results on 1640 pages for 'project setup'.


  • Why is Server 2012 assigning a "169.254.*.*" address when creating a DHCP server?

    - by Seth
    I have a small office with an AT&T Motorola modem (192.168.1.254) set as passthrough to a D-Link DIR-815 (LAN 192.168.0.1). I am trying to set up a DHCP server on Server 2012, and when I create the new DHCP server, its title is created as 169.254.*.* instead of the domain name. (Domain clients can retrieve IPs as defined in the scope.) Non-domain clients are not receiving IPs from the server but rather from the Motorola... How do I make sure the DHCP server creates itself properly, and how do I make sure both domain and non-domain clients get IPs from the server?
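
    A hedged diagnostic sketch: a 169.254.x.x name usually means the server's NIC held only an APIPA address when the DHCP role was bound, and non-domain clients falling back to the Motorola suggests another DHCP server on the segment is still answering. The commands below (Server 2012's netsh dhcp context) check the binding and the scope; they assume the role is installed on the local machine:

      rem check that the NIC holds its static address, not an APIPA one
      ipconfig /all
      rem show which interface/address the DHCP service is bound to
      netsh dhcp server show bindings
      rem confirm the scope exists and is active
      netsh dhcp server show scope

    If the binding shows 169.254.x.x, giving the NIC a static address and re-creating the server should fix the name; disabling DHCP on the D-Link (and on the Motorola, if it still answers in passthrough mode) would let non-domain clients reach the Windows server.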

    Read the article

  • git, egit, submodules, and symlinks -- how should shared sub-projects be handled in eclipse?

    - by Autophil
    Here's the situation. I have a few git projects with a directory structure laid out more or less like this:

      simpleproj
        app
        www
          admin
          demo
        lib
          model
          orm
          view
        model
          user
          blah
          ...

      storeproj
        app
        www
          about
          mobile
          fbapp
        lib
          model
          orm
          view
        model
          user
          message
          cart
          product
          merchant

    Each directory in "lib" contains a separate project, either created in-house or forked, all of which use git for source control. So I figured I should make them submodules of my projects, right? Well, we've been moving toward eclipse + egit, because some of our windows guys who aren't used to a CLI need something they can use without being scared of screwing things up. Anyway, the problem is, egit doesn't support submodules. So my solution has been a rather crude one involving symlinks... let's say my directory structure on my dev box is generally laid out like this:

      ~/projects/
        bigproj
          .git
          app
          lib
            model (-> ~/lib/model/src/)
            orm (-> ~/lib/orm/src/)
        neatproj
          .git
          app
          lib
            view (-> ~/lib/view/src/)
        oldproj
          .git
          app
          lib
            orm (-> ~/lib/orm/src/)

      ~/lib/
        model
          .git
          src
          README.md
        orm
          .git
          src
          COPYING
        view
          .git
          src

    The symlinks point inside the directory holding the git repo, so eclipse doesn't get confused, and everything sort of works. On my machine, I can update the libs from anywhere and all projects will be updated (needing to be committed again, of course). Each project stores a separate copy of the contents of the symlinked directories within "lib" -- but only when staged from within eclipse. After committing from eclipse and moving back to the CLI, git sees that a bunch of files have been removed and a few symlinks have been created. Of course this is acceptable too, probably more so than keeping a separate history of the libs for each project... but eclipse and CLI git obviously need to be on the same page so tons of files aren't vanishing and reappearing.

    So this brings me to my question. I'd like to know how to either: get eclipse+egit to see the symlinks as symlinks, if git will somehow handle them properly*, or get the CLI git to treat them as non-symlinks. Or, if there's a better way to do this, I'm all ears. Hope this all made sense! :D

    Note: I tried to tag this as git-submodules, but was not allowed :(

    * Should I make them relative or absolute? Either way it's a mess. Also, will symlinks work on windows? I know there's something similar, but you need a 3rd-party tool to manage them AFAIK; I doubt these would translate well.
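
    If the aim is to drop the symlink hack entirely, here is a hedged sketch of the submodule workflow from the CLI; egit would then just see ordinary checked-out directories. The repository URLs are placeholders:

      # add each shared library as a submodule of the consuming project
      git submodule add git@host:lib/model.git lib/model
      git submodule add git@host:lib/orm.git lib/orm
      git submodule update --init --recursive

      # to pull library updates into the superproject later:
      cd lib/orm && git pull origin master && cd ../..
      git add lib/orm
      git commit -m "bump orm to latest"

    The Windows contingent would only need the CLI for the occasional submodule update; day-to-day editing and committing inside each directory still works from eclipse.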

    Read the article

  • Testing virtualisation of a Debian server (probably VMware vSphere)

    - by xyza
    I'm soon getting access to a powerful root server (quad-core, 16 GB RAM, 1 Gbit connection) where game servers (Minecraft, Counter-Strike, etc.) for different customers should be set up. My plan is to use software such as VMware vSphere to create a virtual machine for each customer. Inside such a virtual machine I'll set up the game server and maybe some kind of FTP server where needed. Since I'm kind of new to server virtualisation, I want to test this locally on my desktop computer first. Is it possible to create a Debian virtual machine using VMware Player on my Windows desktop and then install VMware vSphere inside that VM to create multiple VMs inside it? Or do I really need to install Debian directly on my desktop computer? (I want to use the time during installations, etc. to keep working in my Windows installation.) Tips on virtualising Debian servers are also appreciated :)
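
    One hedged possibility for the local test: VMware desktop products of this era (Workstation 8 / Player 4) can expose hardware virtualization to a guest, which is what lets ESXi run nested VMs. A sketch of the .vmx additions, with the caveat that the exact setting names vary by product and version:

      # hedged .vmx sketch for running ESXi nested inside Player/Workstation;
      # setting names are version-dependent
      vhv.enable = "TRUE"
      guestOS = "vmkernel5"

    Note that ESXi itself would be the nested hypervisor here; installing "vSphere" inside a plain Debian VM won't work, since vSphere is the management layer around ESXi hosts rather than something installed on Debian.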

    Read the article

  • Windows cache not working as it should?

    - by piotrektt
    I run Windows Server 2012 Datacenter. The machine has 60 GB of RAM. I have one file shared from a VHD; when I copy this file locally, the RAM cache fills up as expected, but when multiple computers connect to the share, the cache is not used. The network is 8 Gbit. The whole network is around 200 computers that need to read that one file, but on this setup as few as 10 connections kill the server. Is there any way to check what is going on? What other solution can I use to manage the cache in Windows?

    Read the article

  • Where to find a list of online TV/video/webcam sources?

    - by Frank
    I know there are lots of web sites that offer online TV/stream viewing, such as http://tvunetworks.com, http://www.hulu.com/ and more, but the sources of their streams are usually well hidden. I wonder if there is any open source project that collects online TV/video/webcam sources, so that TV stations and individuals can publicly list their stream sources in the following format; you can copy the URLs below into a browser and start watching:

      Greek TV|mms://eu02.egihosting.com/938657?MSWMExt=.asf
      Turkish TV|http://www.bizidinle.com/player/SAlone.asp?id=7

    Even if there is no public open source project, is there anywhere I can find such a list so that I can get to the stream URLs?

    Read the article

  • Upgrading to x64 results in HTTP 500

    - by Dour High Arch
    I upgraded my development machine to 64-bit Win7, and now when I try to connect to a local ASP.Net project I get:

      HTTP Error 500 ... Calling LoadLibraryEx on ISAPI filter "C:\Windows\Microsoft.NET\Framework\v4.0.30319\aspnet_filter.dll" failed

    There are several puzzling things about this. The ASP.Net project was a .Net 2.0 ASMX, so it was using C:\Windows\Microsoft.NET\Framework\v2.0.50727. If it updated to x64 without asking me, should it not use C:\Windows\Microsoft.NET\Framework64\v2.0.50727\? Where is C:\Windows\Microsoft.NET\Framework\v4.0.30319\ coming from? I opened IIS Manager, and the selected .Net Framework version for my machine is 2.0.50727; the Framework version for my default application pool is the same. I am developing in VS2008, which does not even have an option for targeting .Net 4.0.
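
    A hedged workaround while sorting out the root cause: the v4.0 aspnet_filter.dll reference is typically a machine-wide ISAPI filter registration left over from the upgrade, and on x64 IIS a 32-bit filter fails to load into a 64-bit worker process. Switching the application pool to 32-bit (so the Framework\ rather than Framework64\ binaries load) is one common fix; the pool name below is a placeholder:

      rem run the pool as 32-bit so 32-bit ISAPI filters can load
      %windir%\system32\inetsrv\appcmd set apppool "DefaultAppPool" /enable32BitAppOnWin64:true
      rem or inspect where the stray v4.0 filter registration comes from
      %windir%\system32\inetsrv\appcmd list config /section:isapiFilters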

    Read the article

  • Limit user to one CPU core in Windows 7

    - by Dean Ward
    I let my family members use my PC over RDP to play their flash-based games, as their laptops overheat if they play directly on them. I have it set up so I can use the PC at the same time as them. The PC has a quad-core CPU, and I would like to be able to assign one of those cores to the user logged in via RDP so that the other 3 cores are left alone. Is this possible? They log in via a specific user account set up for the purpose. Thanks for any advice!
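
    A hedged sketch: Windows 7's start command accepts a processor-affinity mask, so the game browser could be launched for that account pinned to core 0 (hex mask 1), leaving the other three cores alone. The browser path is an assumption, and this constrains the launched process rather than the whole session:

      rem launch the flash-game browser pinned to core 0 (hex affinity mask 1)
      start "" /affinity 1 "C:\Program Files\Internet Explorer\iexplore.exe"

    A shortcut or logon script for the RDP account could wrap this command.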

    Read the article

  • Role-based network routing

    - by Steve Butler
    Apologies, my networking skills are a tad rusty. I'm looking for a way to set up a system that gives me role-based access to specific network resources. For example, I have three private subnets for specific groups, and users will need access to one or more of them. I'd like to have all client machines on the same subnet/VLAN, and then use 802.1x to authenticate to a router (NAC device/whatever); the router would then see which user had authenticated (a huge plus if it could determine the AD group), and then allow routing to one or more of the three private subnets based on group membership. I've looked at PacketFence, and it appears to work by assigning a client to a VLAN, but I'd still need a way to route some users into different back-end networks.

    Read the article

  • VPN Connection Causes Internal LAN Connection Loss with Server

    - by sleepisfortheweak
    I've tried configuring a basic PPTP VPN at my small business using a number of different tutorials. As far as I can tell, the actual VPN connection works fine, but upon connecting a client, the server 'disappears' from the internal LAN. The RRAS service must be stopped before the connection is restored.

    My setup: The network is simply a DSL gateway/router to the outside, functioning as NAT/firewall/DHCP. The server is a Win Server 2008 machine at fixed IP 192.168.1.200. The server has 1 NIC, so I used the 'custom' option when configuring RRAS. The RRAS settings should be default, except that I've disabled ports for connection types I'm not using and reduced PPTP ports to 10. I've also created an address pool and disabled DHCP packet forwarding. The server only functions as a file share and, now, a VPN server. Local LAN computers all have network shares mapped to the server, authenticated against local users/groups set up on the server.

    The problem: The moment a client connects through VPN, the server 'disappears' from the local network. All mapped drives disconnect and there is no response to a ping of 192.168.1.200. Even if the client disconnects, the server does not reappear at that address until the RRAS service is stopped.

    I've tried: using an address pool inside and outside the local subnet; using DHCP relay; checking inbound/outbound filters (none enabled).

    The fact that nothing I've tried has had any effect, and that I can connect and successfully obtain an IP, tells me it's something more fundamental I'm missing. My gut tells me it's something to do with the second IP address added by the VPN client somehow taking over the interface, or traffic from the local LAN accidentally getting routed to the VPN client instead of being handled at the server, once RRAS becomes 'active' when a client connects. Hopefully this is obvious to someone with real IT experience. I've been doing this a while and have almost never been stumped; I'm starting to think it might actually be something tricky, since my setup is pretty basic yet refuses to work. I'll be happy to include more info if this doesn't ring any bells right away for anyone. Thanks

    Read the article

  • Will NTP work in an isolated network (in the absence of a reliable time source)?

    - by Anand
    Hi, I am investigating a typical NTP problem. The setup is as follows: FreeBSD is being compiled and run on OpenSolaris. The config file on OpenSolaris lists a Linux machine and another OpenSolaris machine as servers, and those server machines are syncing time with themselves (local clock) only. The server machines in this case have NTP running on them as well. Within a few minutes of starting the NTP daemon, the client starts syncing time with itself only and remains in that situation after that; all servers are discarded and no time syncing is done with them. My question is: is there any fundamental problem with this setup? Will NTP work in such an isolated network that has no direct or indirect connection to a reliable internet time source?
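
    NTP can work in an isolated network, but the isolated reference has to be declared deliberately on one machine. A hedged ntp.conf sketch for the box acting as the LAN's time source (the classic local-clock pseudo-driver; newer ntpd versions offer orphan mode via "tos orphan" instead):

      # advertise the local clock at a deliberately poor stratum
      server 127.127.1.0        # local clock pseudo-driver
      fudge  127.127.1.0 stratum 8

    The clients should then list only that server and not fudge their own local clocks; if every machine also serves its own local clock, each one ends up preferring itself, which matches the symptom described.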

    Read the article

  • Rails /tmp/cache/assets permissions issue using Debian virtual machine hosted on OS X Lion

    - by Jim
    I am running Parallels Desktop 7 on OS X Lion. I have a VM with Debian installed, and inside that VM I set up a Rails development environment. I am using Parallels Tools to share out my OS X home directory to the VM; the goal is to run the Rails server on the VM but host the files on OS X (so they are automatically backed up, and so I can use tools like Textmate to develop with). Everything seems to work with the shared directory: my Debian user can read, write, and execute files. However, when I cloned a recent Rails project from Git, I got an error message when it tried to compile the CSS assets. My symptoms are exactly the same as in this question: http://stackoverflow.com/questions/7556774/rails-sprocket-error-compiling-css-assest-chown-issue

    I believe this is permissions-based, but it is really weird. My entire Rails project directory has permissions set to 777 and my Debian user owns it. If I navigate into /tmp/cache/assets, those permissions are the same. However, the three-character directories Rails is creating (DCE, DA1, D05, etc...) are being created without write permissions! If I refresh the Rails page a few times, about 4 or 5 (with Rails creating new three-character directories every time), eventually it will create one of the directories with the proper 777 permissions and everything will work. This persists until I make a change to the CSS files and it has to recompile. Does anyone have any idea what might be going on here? I can't fathom why it is creating temp directories with incorrect permissions, or why after a few refreshes the good permissions kick in and it works... It definitely seems to be an issue with the share, since if I move the project into a different directory on the VM, it works fine. On the OS X side, I've given the shared folder 777 permissions as well, but no dice... any ideas?

    Update: I've found that the number of times I need to refresh before it works is not random; it has to do with how many assets are being compiled. For example, if I edit one of my CSS files and there are four CSS files in the app/assets/stylesheets directory, I have to refresh four times before the app will finally work without the "operation not permitted" error...
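
    A hedged workaround that sidesteps the share semantics entirely: keep the app's tmp/ on the VM's native filesystem, so sprockets' cache writes (and the permission bits on the directories it creates) never touch the Parallels share. The local path is a hypothetical placeholder:

      # move the asset cache off the shared folder onto the VM's local disk
      mv tmp /home/jim/rails-tmp        # hypothetical local path
      ln -s /home/jim/rails-tmp tmp

    The source files stay on the OS X side for backup and editing; only the disposable cache lives on the VM.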

    Read the article

  • How do I get a Dell Latitude e6420 working?

    - by David_G
    I've just installed Ubuntu 12.04 (64-bit) on a brand new Dell Latitude e6420, and I'm having a few problems. This laptop has an Optimus setup, i.e. integrated graphics plus an Nvidia Quadro NVS 4200M. First problem: I ran setup, etc., and discovered that I can only run unity2d; if I try to log in with unity3d, it just defaults to 2D. This is with nvidia-current installed (302.07). Note also that I can't run nvidia-settings ("You do not appear to be using the NVIDIA X driver."), and no additional drivers are found ("No proprietary drivers are in use on this system"). I tried to troubleshoot this and removed the Nvidia packages, leaving (I guess) just the Nouveau driver; in that case unity3d did work, but I was stuck with the open source Nouveau driver powering the integrated graphics. So, obviously, I want to run unity3d and be using the more powerful Nvidia graphics card. I've tried a bit of tinkering, but I'm not sure of the best way to proceed, or perhaps more importantly, what the best final solution might be. I've heard about Bumblebee, but frankly I would prefer to have the proprietary Nvidia drivers working properly. Any help would be much appreciated!
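
    For what it's worth, on Optimus hardware of this era the proprietary Nvidia driver alone generally cannot drive the display, because the Intel GPU owns the outputs; that is the gap Bumblebee fills. A hedged sketch of the usual 12.04 install from the community PPA:

      sudo add-apt-repository ppa:bumblebee/stable
      sudo apt-get update
      sudo apt-get install bumblebee bumblebee-nvidia
      # after a reboot, run individual programs on the Nvidia GPU:
      optirun glxgears

    The desktop (and unity3d) then runs on the Intel GPU with its driver, while optirun offloads chosen applications to the Quadro.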

    Read the article

  • What is the R Language?

    - by TATWORTH
    I encountered the R language recently in O'Reilly books, and while from the context I knew it was a language for dealing with statistics, a web search for the support web site was initially futile. I have now located it: http://www.r-project.org/

    R is a free language available for a number of platforms, including Windows. CRAN mirrors are available at a number of locations worldwide. Here is the official description:

    "R is a language and environment for statistical computing and graphics. It is a GNU project which is similar to the S language and environment which was developed at Bell Laboratories (formerly AT&T, now Lucent Technologies) by John Chambers and colleagues. R can be considered as a different implementation of S. There are some important differences, but much code written for S runs unaltered under R. R provides a wide variety of statistical (linear and nonlinear modelling, classical statistical tests, time-series analysis, classification, clustering, ...) and graphical techniques, and is highly extensible. The S language is often the vehicle of choice for research in statistical methodology, and R provides an Open Source route to participation in that activity. One of R's strengths is the ease with which well-designed publication-quality plots can be produced, including mathematical symbols and formulae where needed. Great care has been taken over the defaults for the minor design choices in graphics, but the user retains full control. R is available as Free Software under the terms of the Free Software Foundation's GNU General Public License in source code form. It compiles and runs on a wide variety of UNIX platforms and similar systems (including FreeBSD and Linux), Windows and MacOS."

    Read the article

  • Material to use for computer system cover against UV and salty air?

    - by hippietrail
    I live right next to the sea and have a large window quite close to my computer setup which lets in a lot of indirect sunlight. I'd like to buy or make a cover for my computer system; from visiting my usual mom & pop computer shop yesterday, I got the impression these might not really exist any more. If I make my own, I need a material with these qualities:

      - Blocks or reduces ultraviolet light, which can depolymerize plastics (the sun here in Australia is much stronger than in the northern hemisphere).
      - Blocks salt-laden sea air, which can oxidize USB and other connectors.
      - Doesn't cause static electricity when covering or uncovering.
      - Keeps dust off, of course (-:

    My setup is a laptop plugged into a wide-screen LCD with a few external drives, so I think I'd want a largish sheet to flop over the whole desk. Are such covers commonly sold these days? What material(s) should I look for that provide the listed attributes?

    Read the article

  • How to connect to my US network overseas via VPN?

    - by GiH
    I purchased an Apple TV for my parents, and I have a Netflix account. My parents live overseas, and I was wondering if they could use my account with it. I read that it won't work unless you use proxies or a VPN, so I was wondering if it's possible for me to set up a VPN to my network in the US instead of paying for a service like StrongVPN? Setup: the router in the US is an Airport Extreme; the router abroad is a D-Link (not sure of the model). I know the Apple TV doesn't have a built-in VPN client (maybe eventually, once it's jailbroken, there will be an app), so as of now I'll have to use the routers, right? Any other ideas are welcome as well!

    Read the article

  • Run command before and after printing with CUPS?

    - by leto
    Hello, this is a home setup. A central print server (Linux) manages the queue; an HP 2430DTN is attached to it via 100 Mbit/s Ethernet, and the printer is hooked up to a manageable power source. A shell script watches the queue on the server (lpstat -o) and turns on the printer when there is a job; if the queue is empty for 10 minutes, it turns the printer off. Now, this setup messes up, stops the printer, etc. after a couple of weeks, and is in general "not so reliable". I now know how to change the stop-printer thing, but: is there a way to run my turn-printer-on and turn-printer-off scripts directly from CUPS, without watching the queue? That would be so cool!
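
    CUPS has no official pre/post-job hooks, but one hedged approach is a thin wrapper backend: a script installed in /usr/lib/cups/backend/ that powers the printer on, hands the job to the real backend, and powers it off afterwards. The printer-power helper below is a hypothetical stand-in for the managed power outlet's control command, and a production backend would also need to answer CUPS's zero-argument device-discovery call:

      #!/bin/sh
      # hedged sketch of a wrapper CUPS backend
      /usr/local/bin/printer-power on           # hypothetical power-switch helper
      /usr/lib/cups/backend/socket "$@"         # delegate to the real backend
      rc=$?
      /usr/local/bin/printer-power off
      exit $rc

    Third-party wrappers such as Tea4CUPS exist for exactly this pre/post-job use case, which may prove less fragile than the queue-watching script.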

    Read the article

  • Server 2003 Functional Domain DFS Replication Problem (Files being moved to conflicted folder for no reason)

    - by Az
    We have 2 Windows 2003 servers configured with a DFS namespace, and we are running into problems with the redirected profiles we have set up. Basically, one server is the FSMO master for all roles, and we have another DC that is the DFS namespace primary server. We have profile redirection set up using the \\dfsnamespace\userprofile formula. The FSMO master DC locks up occasionally (don't ask :), and when it does and we bring it back up, all of the user profiles hosted on the DFS namespace get overwritten when a user logs in: the current profile gets moved to the conflicting and deleted items folder. This strikes me as really odd, considering the whole point of using DFS was to provide some redundancy in case one server went down. Can anyone help? Thanks in advance! -Nate

    Read the article

  • How to have publishing, blog and wiki features together?

    - by George2
    Hello everyone, I am using SharePoint 2007 Enterprise + the Publishing Portal template + Windows Server 2008. I want to have blog and wiki features as well as the publishing portal features. Any ideas how to integrate the publishing portal, blog and wiki? By integrate, I mean using the same user name and password to pass through authentication on the publishing portal, blog and wiki. And should I set up 3 different site collections for the publishing portal, blog and wiki? (I find that if I set up a publishing portal site collection, I cannot create blog and wiki sub-sites.) Thanks in advance, George

    Read the article

  • Copy Ubuntu distro with all settings from one computer to a different one

    - by theFisher86
    I'd like to copy my exact setup from my computer at work to my computer at home, and I'm trying to figure out how to go about doing that. So far I've figured this much out. On the source computer:

      - Run dpkg --get-selections > installed-software and back up the installed-software file
      - Back up /etc/apt/sources.list
      - Back up /usr/share/applications/ to save all my custom quicklists
      - Back up /etc/fstab to save all my network mounts
      - Back up /usr/share/themes/ to save the customization I've done to my themes

    I'm also going to back up my entire HOME directory. Once I get to the destination computer, I'll first do a fresh install of 11.10, then copy over my HOME directory, /etc/apt/sources.list, /usr/share/applications, /etc/fstab and /usr/share/themes/. Then I'll run dpkg --set-selections < installed-software, followed by dselect; that should install all of my apps for me.

    I'm wondering if there's a way (or a need) to back up dconf and gconf settings from the source computer? I guess that's my ultimate question. I'd also like any notes on anything else that might need backing up before I undertake this project. I hope this post is legit; I figured other people would be interested in knowing this process, and I don't see any other questions that really document it on here. I'd also like to take this further and have each computer routinely back up all the necessary files so that both computers are basically identical at all times. That's stage 2, though...
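
    On the dconf/gconf question, both stores can be dumped to plain files and loaded again on the destination, alongside the package list. A hedged sketch, assuming the dconf CLI (dconf-tools package) is installed:

      dconf dump / > dconf-settings.ini            # on the source machine
      dconf load / < dconf-settings.ini            # on the destination
      gconftool-2 --dump / > gconf-settings.xml    # legacy gconf-based apps
      gconftool-2 --load gconf-settings.xml

    Much of this state also lives under ~/.gconf and ~/.config/dconf, so copying the whole HOME directory covers most of it implicitly; the dumps are mainly useful for a selective restore.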

    Read the article

  • Retrofit WebForms with ASP.NET MVC - NoVa Code Camp 2010.2 Demo

    - by Soe Tun
    Thank you to everyone who attended my Retrofit WebForms with ASP.NET MVC session at NoVa Code Camp 2010.2. It was a fun event for me, and I hope you had a great time and learned something from it. I wish I had more time to go over some more important topics in detail. I *promise* I will write a blog post series about it, since I'll have some vacation time during the December holidays to cover the topics I didn't get to.

    Please note that the ".bak" file included in the zip file is a SQL Server database backup file. You have to restore it on your database server to run the demo source code.

    Please feel free to ask me about the demo project through Twitter or from this blog post; I'll be glad to help you out. If you want me to give this presentation at your .NET user group, please let me know and I'll be honored to speak there too.

    Again, thank you all and have a great holiday season. Here is the download link to my demo project zip file, with the PowerPoint presentation in it. Please let me know if the link doesn't work.

    Read the article

  • Need help with ColdFusion and ASP.NET site [closed]

    - by Michael Stone
    To begin, I wasn't too sure how to title this... I've got a few questions.

    First off, I've got a very big site that's in ColdFusion, and we've been migrating to ASP.NET C# 4.0 for the last 8 months. I've got a team of 7 programmers, and no one can seem to figure out these answers, not even our senior C# programmer. We're using Team Foundation Server, and we can't figure out how to push up only one small change at a time. Right now we're stuck publishing the entire site, and it's causing serious issues. We currently have the site as a Project and not a Website; we're wondering if that's one issue. I actually think it might be a problem.

    We're also dealing with an issue where we can't access our regular folders with relative paths. We're developing our admin side in .NET first. We've got our regular site, and then we've got another site within that for our .NET admin tools. By site, I'm referring to them actually being Sites in IIS. This also creates a problem when we're building tools that upload images and want to store them in, and access them from, our parent site.

    I'd very much appreciate any advice on how to go about this in the most standardized way. So what I'm hoping for is advice on:

      - Publishing and managing a site/project in Team Foundation Server. Being able to push up one fix at a time, if needed, would be GREAT!
      - Figuring out the issue of referencing folders from my .NET child site to my parent ColdFusion site using regular relative paths. "/a/images/b/" would be nice instead of only being able to do "/b/images/".

    We're using ColdFusion 8, C# ASP.NET 4.0/Entity Framework/POCO Templates, and a Windows 2008 R2 server. Thank you in advance for any help.
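
    On the relative-path question, one hedged option: instead of a second IIS site, register the .NET admin area as an application under the main site, so both share one URL root and root-relative paths like "/a/images/b/" resolve naturally. The site name and physical path below are placeholders:

      rem mount the .NET admin app beneath the existing ColdFusion site
      %windir%\system32\inetsrv\appcmd add app /site.name:"MainSite" /path:/admin /physicalPath:"D:\sites\adminapp"

    On the deployment question, publishing a Web Application Project always ships the compiled assembly as a unit; pushing single-file fixes generally means either converting to a Web Site project or deploying only the changed views/static files plus the rebuilt DLL.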

    Read the article

  • What's the best way to manage reusable classes/libraries separately?

    - by Tom
    When coding, I naturally often come up with classes, or sets of classes, with high reusability. I'm looking for an easy, straightforward way to work on them separately. I'd like to be able to easily integrate them into any project, and it should be possible to switch to a different version with as few commands as possible. Am I right in assuming that git (or another VCS) is best suited for this? I thought of setting up local repositories for each class/project/library/plugin and then just cloning/pulling them. It would be great if I could reference those projects by name, not by the full path; like git clone someproject.

    edit: To clarify, I know what VCSs are about and I do use them. I'm just looking for a comfortable way to store and edit some reusable pieces of code (including unit tests) separately, and to be able to include them (without the unit tests) in other projects, without having to manually copy files. Apache Maven is a good example, but I'm looking for a language-independent solution, ideally command-line-based.
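
    A hedged sketch of one way to get both halves of this with plain git: a URL alias gives clone-by-name, and submodules pin a specific version of each library inside a consuming project. The repository base path is a placeholder:

      # clone-by-name via a URL alias
      git config --global url."/home/tom/repos/".insteadOf "lib:"
      git clone lib:someproject

      # inside a consuming project, pin a particular version of the library
      git submodule add lib:someproject lib/someproject
      cd lib/someproject && git checkout v1.2    # hypothetical version tag

    Switching versions is then a checkout inside the submodule plus one commit in the parent, and the unit tests stay in the library's own repository rather than in each consumer.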

    Read the article

  • Yum through http proxy

    - by eodchop
    Hello, I have several Fedora 13 servers that have to connect through an HTTP proxy for yum updates; all port 80 traffic has to be routed through this proxy. I have set up the proxy server in the network settings GUI and can browse the internet just fine. I have also set up my proxy information in /etc/yum.conf as follows:

      proxy=http:proxy.largecorp.corp/accelerated_pac_base.pac
      proxy_user=user
      proxy_password=password

    I then added export HTTP_PROXY="http:proxy.largecorp.corp/accelerated_pac_base.pac" to /etc/bashrc and sourced the file. When I run yum update, I get:

      Loaded plugins: presto, refresh-packagekit
      Error: Cannot retrieve repository metadata (repomd.xml) for repository: fedora. Please verify its path and try again.

    All of the repo URLs are the defaults, as this is a fresh install.
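
    A hedged observation: the value above points at a proxy auto-config (.pac) file, which a browser can evaluate but yum cannot; yum expects a plain proxy URL with a port. A sketch of /etc/yum.conf with a placeholder host and port (note that yum's keys are proxy_username and proxy_password, not proxy_user):

      # /etc/yum.conf -- host and port are placeholders for whatever
      # the .pac file actually resolves to
      proxy=http://proxy.largecorp.corp:8080
      proxy_username=user
      proxy_password=password

    The same host:port form (e.g. export http_proxy="http://proxy.largecorp.corp:8080") is what command-line tools generally expect in the environment as well.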

    Read the article

  • Clutter for game GUI

    - by tjameson
    I'm pretty new to game development, having only written a simple 3D game for a class project, but I'd like to get started on a bigger project. I'm writing an MMORPG to run both in the browser (WebGL) and natively (OpenGL ES 2). In choosing a GUI toolkit, I'm trying to find a style that would work natively and would be simple to emulate in WebGL. I am considering using D or Go for writing my game, so interfacing with C++ libraries will be difficult, if not impossible. Of course, the language isn't the end goal here, so if using C++ will save considerable time, I'll bite the bullet and use that. In order to reduce the amount of code I'll have to write for the browser, I'm considering using something simple like Clutter for basic abstractions, which I think would be pretty easy to emulate (layered canvases, maybe?). Does anyone have experience using Clutter for a 3D game? Note: I haven't used any game development libraries, and I only have limited experience with GUI libraries. I do have HTML+CSS experience, so maybe libRocket is a viable solution?

    Read the article

  • My WD Caviar Green SATA HDD cannot be detected

    - by obhasha
    I have a WD 2TB Caviar Green SATA HDD. I used it as an external hard drive, formatted FAT32; the main purpose was to store my PS3 games. It worked fine, and then all of a sudden my PS3 stopped detecting it. I tried connecting it to my PC, with the same result. I took it out of the enclosure, plugged it directly into my PC, and went into the BIOS setup. The BIOS detects the HDD but says it's 0.0MB! I also tried the WD Data Lifeguard Diagnostic tool, but it wasn't much use because the drive is not picked up by Windows at all. Is my HDD dead? Is it beyond repair?

    Read the article
