Search Results

Search found 30555 results on 1223 pages for 'closed source'.


  • How to set up using VMware and AD [closed]

    - by user552585
    In my organisation we have more than 100 PCs and three high-spec IBM servers. The scenario: 300 employees, programmers working in .NET, Java, PHP and so on, share these systems across different shifts without interrupting one another's work. I want every application they need available on every system, I want each user to log in with an individual ID and password, and I need the organisation's data and the users' data secured against tampering by other users. Please explain, briefly, how to set this up using VMware and Active Directory in a Microsoft environment, in a fully secured manner. Please help me.

    Read the article

  • Router Suggestions [closed]

    - by zamN
    Lately I've been having signal-strength problems with my Linksys WRT54G. I'm thinking of getting a new router, since the Linksys is about five years old now, and I'm curious which router to get because I want the most out of my money. I already have a modem that I will connect to the router, and I don't need VoIP or any of that. All I want is something that is going to last and will not have any stupid problems.

    Read the article

  • Need for J2ME source code

    - by tikamchandrakar
    For J2ME, it strikes me as odd that you need an extra "API key" and so on. What I really want is NOT to create an extra Facebook application that has to be registered on Facebook; I don't want my application's users to have to undergo any extra configuration effort. All a user should need is his well-known Facebook login data, and everything else should be completely transparent to him. So I thought the login process could perhaps be done by sending a request to the REST server via HTTP. I know this would give me XML, and I would hope the API could automatically transform that XML into an intuitive object model representing the user's Facebook data. In short, I would expect something like userData = new FacebookData(new FacebookConnection("user_name", "password")). Done. No API key, no secret key, just the well-known login data. Practically, the equivalent of Thunderbird WebMail, which lets you access your MSN Hotmail account from Thunderbird by automatically converting the HTML obtained from a Hotmail browser login into the data structures usually passed to a mail client. I was expecting the equivalent for this API.
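
    Purely to make the wish above concrete, here is a minimal, self-contained Java sketch of the interface the asker is imagining. FacebookConnection and FacebookData are hypothetical names taken from the question, not any real Facebook API; in reality Facebook requires an application key and token-based authentication, so this compiles and runs but performs no actual login.

        // Hypothetical classes sketched from the question; they do not exist
        // in any real Facebook SDK.
        class FacebookConnection {
            private final String userName;
            private final String password;

            FacebookConnection(String userName, String password) {
                this.userName = userName;
                this.password = password;
            }

            String getUserName() { return userName; }
        }

        class FacebookData {
            private final String displayName;

            FacebookData(FacebookConnection connection) {
                // A real implementation would log in over HTTP, fetch XML from
                // the REST endpoint, and parse it into fields like this one.
                this.displayName = connection.getUserName();
            }

            String getDisplayName() { return displayName; }
        }

        public class FacebookSketch {
            public static void main(String[] args) {
                // The one-liner the asker wants: login data in, object model out.
                FacebookData userData =
                        new FacebookData(new FacebookConnection("user_name", "password"));
                System.out.println(userData.getDisplayName());
            }
        }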

    Read the article

  • Server Design Enquiry [closed]

    - by Brandon Gelfand
    I am really new to this and am having trouble working out how to store information from a website. I am going to build a site similar to Dropbox for my company, and we need around 40 TB of space that we can access on the go, for instance pulling a file up on a phone or laptop over the internet. Using an Amazon S3 server is obviously not the most cost-effective way of doing this, so how can I build my own server to hold all of this data and communicate with it from a website? I need as much help as possible, a blueprint or something. Thanks, and sorry for the noobish question, but I can't really find an exact answer to it.

    Read the article

  • FTP: check whether file is closed

    - by Andrey
    Hello! My FTP client (Apache Commons FTPClient) needs to download a file from an FTP server. The problem is that an external service may not have finished writing to the file when I start downloading. Is there any way to determine via FTP whether the file has already been closed (i.e. there are no open write handles)? I cannot synchronize with that external service, but I must not download a file that is not yet finished (a large JPEG). Thanks in advance!
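
    For context: the FTP protocol itself has no command that reports whether another client still holds a file open for writing, so a common workaround is to poll the remote file's size and download only once it stops changing. Below is a minimal sketch of that heuristic using Apache Commons Net's FTPClient, which the asker mentions; the host, credentials, remote path, and two-second stability window are placeholder assumptions, and size stability is a heuristic, not a guarantee.

        import java.io.IOException;

        import org.apache.commons.net.ftp.FTPClient;
        import org.apache.commons.net.ftp.FTPFile;

        public class WaitForStableFile {

            // Returns once two consecutive size checks agree, i.e. the writer
            // has probably finished.
            static void waitUntilStable(FTPClient ftp, String path)
                    throws IOException, InterruptedException {
                long previous = -1;
                while (true) {
                    FTPFile[] files = ftp.listFiles(path);
                    if (files.length > 0) {
                        long current = files[0].getSize();
                        if (current > 0 && current == previous) {
                            return;                     // size stopped changing
                        }
                        previous = current;
                    }
                    Thread.sleep(2000);                 // stability window
                }
            }

            public static void main(String[] args) throws Exception {
                FTPClient ftp = new FTPClient();
                ftp.connect("ftp.example.com");         // placeholder host
                ftp.login("user", "password");          // placeholder credentials
                ftp.enterLocalPassiveMode();            // often needed behind firewalls
                waitUntilStable(ftp, "incoming/photo.jpg");
                // ... ftp.retrieveFile(...) here, once the size is stable ...
                ftp.logout();
                ftp.disconnect();
            }
        }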

    Read the article

  • Who should own /var/www? [closed]

    - by John
    Possible Duplicate: How should I structure my users/groups/permissions for a web server? I've seen a few answers to this on the internet, but I'm looking for a definitive answer. I have a new Ubuntu 12.04 LTS server with LAMP. Apache is set to run as "www-data", and /var/www has "root" as the owner and "root" as the group. The permissions for /var/www are "drwxr-xr-x", which translates to 755 numerically. I know that /var/www should not be owned by "www-data", because then buggy/malicious code could have a field day. However, should I keep it as root:root (inconvenient), or should I change it to ubuntu:ubuntu, the default user that Ubuntu preconfigures for you to log in with? Should the permissions remain at 755? I've been administering systems for a while with no big security issues, but I'm trying to get really serious about security, double-check everything, and make sure that there are no gaps in my knowledge.

    Read the article

  • I've lost everything [closed]

    - by Melissa
    Possible Duplicate: Recover data loss from accidental quick format My husband installed Windows 7 on my computer (previously Vista) and in the process deleted everything I had on it. He did it without telling me, so nothing was backed up beforehand. Now all of my documents and my entire iTunes library are gone. Please tell me there's something I can do. I downloaded iTunes again, but nothing is there. Help!

    Read the article

  • Best tool for searching within unstructured log files [closed]

    - by Alex Holding
    I am supporting a number of bespoke applications at the minute, and searching through their very non-standard logs is a nightmare, so I'm looking for a tool that can do the following: load large text files; search through multiple files at once and display all the results; search with regex; and view and search unstructured text files. There are some great-looking log tools available, but they all seem to be focused on structured logs, whereas the logs I deal with most days are just flat text files. I am currently using Notepad++, but that has its own annoyances, so I'm hoping there is a dedicated log-analysis tool that I haven't found yet.
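
    If no ready-made viewer fits, the core requirement, regex search across many flat files with every hit displayed, is small enough to script as a stopgap. Here is a minimal Java sketch under that assumption; the directory and pattern below are placeholders, not anything from the question.

        import java.io.IOException;
        import java.io.UncheckedIOException;
        import java.nio.file.Files;
        import java.nio.file.Path;
        import java.nio.file.Paths;
        import java.util.regex.Pattern;
        import java.util.stream.Stream;

        public class LogGrep {
            public static void main(String[] args) throws IOException {
                Path root = Paths.get("/var/log/bespoke");          // placeholder directory
                Pattern pattern = Pattern.compile("ERROR|timeout"); // placeholder regex

                // Walk every file under the root and print matching lines
                // prefixed with the file they came from, grep-style.
                try (Stream<Path> paths = Files.walk(root)) {
                    paths.filter(Files::isRegularFile).forEach(file -> {
                        try (Stream<String> lines = Files.lines(file)) {
                            lines.filter(line -> pattern.matcher(line).find())
                                 .forEach(line -> System.out.println(file + ": " + line));
                        } catch (IOException | UncheckedIOException e) {
                            // Skip unreadable or non-text files.
                        }
                    });
                }
            }
        }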

    Read the article

  • A moderator closed my question; is anyone watching? [closed]

    - by Registered User
    I do not have the requisite privileges to post many links in my questions, and my genuine question was blocked by this site's moderator. Is anyone watching? The internal IPs in the Apache vhost configuration file I was posting (I am using Apache as a front end to a Tomcat application) were treated as links. Moderators should behave more sensibly; I am new to this forum. How can you say the question is not real when your forum would not allow me to post links to screenshots so that someone could understand what I was asking?

    Read the article

  • Trying to find a good filehost [closed]

    - by user67481
    I'm looking for a good file host that I can use to link downloads on my blog (personally created files, no copyright infringement). I've been looking at MediaFire, but I'm not sure what else is out there that would meet my needs. Ideally I want something that has no files-per-day-per-user limits, can host individual files of at least 500 MB each, and is very little hassle for the users who download from it. I'll pay for a "premium" or similar account if necessary. Any good suggestions? Or will MediaFire be my best bet for this?

    Read the article

  • SAN for Medium Business - Where to start? [closed]

    - by Henson
    I've always run Linux on my home computers and done PC repair for years, but this is my first experience with buying a SAN. I thought I was knowledgeable, but I feel a bit lost. I need to be able to support 25 VMs, which are currently managed through vSphere. The company I'm at is growing quickly, though, so I'd like to plan for the future. Ideally, I want a solution that I can just tack arrays onto and manage as one large iSCSI drive. Suggestions? Good resources? If I can find something that appears to software as one large drive, am I better off going with a solution like FreeNAS or StarWind, or an all-in-one proprietary solution like NetApp? Cost is (of course, and always, I'm sure) an issue.

    Read the article

  • Source code repository portal

    - by gmoorevt
    I am looking for a "portal" for wrapping our large Subversion repositories, similar to GitHub. Does anyone know of any options? Features we are looking for include a home page for each project, a wiki, bug tracking, and so on. This would be for an internal deployment.

    Read the article

  • Shell script or command to search and replace [closed]

    - by Redbox
    Possible Duplicate: My server's been hacked EMERGENCY Lately a website on my server has been infected with nasty JavaScript like this: http://pastebin.com/7XCidF6C I wonder, is there any way to search for and remove the entire script block? I only know how to find which files are affected: find /home/loudcom/public_html/tv -iname '*.*' | xargs grep --color 'f1930e\|fff309' How do I apply sed or some other command to replace the entire block of nasty code with nothing? I'm using CentOS 6.
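
    sed can delete matching lines in place, but confining the deletion to exactly one <script>...</script> block is easier in a short program. Here is a minimal Java sketch of that idea; it uses the path and marker strings from the question, assumes the injected code sits inside a single <script> block per file, and should only ever be run against a backup, since regex-based HTML surgery is a heuristic.

        import java.io.IOException;
        import java.io.UncheckedIOException;
        import java.nio.charset.StandardCharsets;
        import java.nio.file.Files;
        import java.nio.file.Path;
        import java.nio.file.Paths;
        import java.util.regex.Pattern;
        import java.util.stream.Stream;

        public class StripInjectedScript {
            // Matches one <script>...</script> block containing a marker string
            // from the question (f1930e / fff309). DOTALL lets "." span newlines;
            // the (?!</script>) lookaheads keep the match inside a single block.
            private static final Pattern INJECTED = Pattern.compile(
                    "<script[^>]*>(?:(?!</script>).)*?(?:f1930e|fff309)"
                            + "(?:(?!</script>).)*?</script>",
                    Pattern.DOTALL | Pattern.CASE_INSENSITIVE);

            public static void main(String[] args) throws IOException {
                Path root = Paths.get("/home/loudcom/public_html/tv"); // path from the question
                try (Stream<Path> paths = Files.walk(root)) {
                    paths.filter(Files::isRegularFile).forEach(file -> {
                        try {
                            String original = new String(
                                    Files.readAllBytes(file), StandardCharsets.UTF_8);
                            String cleaned = INJECTED.matcher(original).replaceAll("");
                            if (!cleaned.equals(original)) {
                                Files.write(file, cleaned.getBytes(StandardCharsets.UTF_8));
                                System.out.println("cleaned: " + file);
                            }
                        } catch (IOException | UncheckedIOException e) {
                            // Skip files that cannot be read or written.
                        }
                    });
                }
            }
        }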

    Read the article

  • Hardware needed for 2000 users? [closed]

    - by Trcx
    I have a school assignment that is fairly well defined, requiring us to come up with a plan for an environment serving dynamic web applications to 2,000 users, scalable up to 6,000. I have done plenty of research on load balancing, redundancy, UPSs, and so on, but I am having a hard time figuring out how much hardware is actually needed in the way of physical servers, RAM, and processing power. The assignment states that the servers will run a lot of dynamic code, and that email and a database are required, all using the appropriate Microsoft products (MS SQL Server, Exchange, IIS). I already plan on splitting these out onto separate servers, but I can't even fathom the hardware requirements of something that large-scale. Could someone with experience weigh in on this, or point me to some good articles?

    Read the article

  • What do desktop users use SSD for? [closed]

    - by Continuation
    I keep hearing people say how much faster their desktops/laptops are after switching to an SSD, but when I ask what they use the SSD for, the answer is always "booting". I'm not trying to start a flame war here. I use an SSD for my database server and it makes a huge difference: a single SSD can replace a 10-drive RAID 10 array and is both much faster and MUCH CHEAPER. But I just can't think of a use case for desktops where an SSD could make a similar impact. Sure, it's kind of nice to cut boot time down from 40 seconds to 10 seconds, but is that a big deal? I would love to hear how an SSD improves your desktop performance.

    Read the article

  • [SOLVED] Can't enable mysql and mysqli extensions in PHP [closed]

    - by Sydcul
    I used to have my website hosted at a hosting company, but I decided to run my own web server, and to get phpBB, MediaWiki, etc. working I need PHP and MySQL. After a bit of screwing around I got those working, but the mysql and mysqli extensions do not seem to work: phpMyAdmin, phpBB, and the rest all report that they are not installed correctly. I uncommented them in php.ini and put my PHP folder on the PATH, but they still don't work. Please note that I am not a PHP developer at all. Using: Windows Server 2003 Small Business (too lazy to install Linux), Apache 2 (not sure what version), PHP 5.2 (threaded, manually installed), MySQL 5.5.28. Thanks in advance, -Sydcul EDIT: Solved. I don't know how; I just used the installer and it worked.

    Read the article

  • Webserver - Memory-bound or CPU-bound? [closed]

    - by JJP
    Possible Duplicate: How do you do Load Testing and Capacity Planning for Web Sites I'm installing a social network using Zend Framework and MySQL, with lots of plugins and queries, and I want the web server and SQL server on one box. I'm trying to choose between two machines (at hetzner.de): (A) an Intel i7-2600 at 3.4 GHz with 16 GB of DDR3 RAM, or (B) an Intel i7-920 at 2.6 GHz with 24 GB of DDR3 RAM. B has 50% more RAM but a roughly 30% slower clock speed. The question is: is it obvious where the bottleneck will be? Would I ever need 24 GB of RAM, even with lots of concurrent users?

    Read the article

  • Set div innerHTML to source that contains a table

    - by lolla
    Hi, I thought what I was trying to do was quite simple, but apparently nothing involving IE ever is. Using JavaScript and Ajax, I set document.getElementById("calender").innerText = mypostrequest.responseText. It works fine in Firefox and IE7, but not in IE8. I suspect it's because the text contains a table, since I have tested it with other text. I can't replace the table. Is there any way to get around this?

    Read the article

  • Copy/paste speed very slow for a large number of files on Windows [closed]

    - by Arno2501
    I've run the following test. I created a folder containing 15,000 files of 400 bytes each using this batch script: @ECHO off SET times=15000 FOR /L %%i IN (1,1,%times%) DO ( fsutil file createnew filename%%i.txt 400 ) Then I copied it on my Windows computer using this command: robocopy LargeNumberOfFiles\ LargeNumberOfFiles2\ After it completed, I could see that the transfer rate was 915,810 bytes/sec, which is less than 1 MB/s; it took several seconds to copy 7 MB. Please note that this is very slow. I tried the same thing with a folder containing a single 50 MB file, and the transfer rate was 1,219,512,195 bytes/sec (yes, GB/s), effectively instantaneous. Why does copying a large number of files take so much time and so many resources on a Windows filesystem? Note that I tried the same thing on a Linux system running on the same computer in a virtual machine (VMware Player) with an ext3 filesystem, using the cp command, and the copy was instantaneous. Also note the following: there is no antivirus; I have tested this behaviour on multiple Windows computers (always NTFS) and always get comparable results (transfer rate under 1 MB/s, 7-8 seconds on average to copy 7 MB); and I have tested on multiple Linux ext3 systems, where the copy is always instantaneous for that amount (15,000 files of 400 bytes). The question is about understanding what makes a Windows filesystem so slow at copying a large number of files compared with, for instance, a Linux one.

    Read the article

  • Memory Speeds: 1x4GB or 2x2GB? [closed]

    - by Dasutin
    When it comes to speed, which is faster: one 4 GB module in your system, or two 2 GB modules? Set aside whether the system has dual-channel capability. And what about a server environment: would it be better to have one large, high-density module, or to break it up into several modules for speed and price? I heard an engineer at my office discussing this with an employee. He said that it is better in all situations to have one large-capacity module instead of breaking it up; it would be cheaper and perform faster, and he also said it would take the computer longer to access what it needed if there were more modules instead of just one. His explanation didn't seem right to me, so I thought I would post the question here to see what other people think.

    Read the article

  • What is the best plan for handling a server fault on Google App Engine [closed]

    - by lucemia
    I have used Google App Engine without preparing much of a backup plan, but that no longer looks like a good idea. Since it is quite hard to find a drop-in replacement for App Engine, I plan to just add a "server error" page to show during an outage. Currently I am thinking of the following: use the CDN Cloudflare in front of Google App Engine, letting it also handle the name servers for me; prepare static versions of some pages (such as "Oops! Server fault") on another hosting platform; and when Google App Engine fails, switch the destination from App Engine to the static pages by changing the CNAME records on Cloudflare. Is there any other recommended way to handle this situation?

    Read the article
