Search Results

Search found 11627 results on 466 pages for 'market share'.


  • Simulating a UNC path with a leading dot

    - by Uwe Keim
    I am a C# .NET Windows Forms developer, and some of our customers run our applications on an Apple OS X Mac inside a Parallels virtual machine. Parallels presents host folders to the guest Windows as UNC paths with a leading dot, like \\.psf\Home\Some\More\Folders. One of our applications cannot handle the leading dot correctly when accessing files from this kind of share (an "Invalid URI, cannot analyze host name" exception). I want to debug and fix this issue; unfortunately, I have no Mac and no Parallels setup here to test with. My question is: is there a way to "simulate" this kind of share on a normal Windows server or client so that I can debug my application with Visual Studio? What I have tried so far: I edited my HOSTS file to contain an entry like # ... 127.0.0.1 .psf # ... but Windows does not seem to recognize the share at all.
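
    Not the simulation asked for, but purely as an illustration of a possible stopgap inside the application, here is a hypothetical path-normalization shim in Python; the assumption that the Parallels share is also reachable through a mapped drive letter (P: below) is unverified:

        # hypothetical shim: rewrite the leading-dot Parallels UNC prefix to a mapped
        # drive before the path reaches URI-sensitive code; ".psf" is the Parallels
        # host name from the question, "P:" is an assumed drive mapping
        def normalize_parallels_path(path, drive="P:"):
            prefix = r"\\.psf\Home"
            if path.lower().startswith(prefix.lower()):
                return drive + path[len(prefix):]
            return path

        print(normalize_parallels_path(r"\\.psf\Home\Some\More\Folders"))  # P:\Some\More\Folders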

    Read the article

  • Computer Stuttering When Transferring Over Network

    - by Nalandial
    This is a really weird problem that I've never seen before. When I copy to or from my server share, my computer stutters terribly and the data transfers very slowly, at only around 12 MB/s. By stuttering I mean the mouse skips around and all my applications respond very slowly; as soon as I cancel the transfer, it resolves immediately. Task Manager shows the CPU at only ~35% with plenty of RAM free. This only started semi-recently; before that, I had no problems and the transfer speed maxed out the gigabit connection. I have two hard drives in my computer. Transferring files between the two drives is fine, but copying from the share to either drive, or from either drive to the share, causes the stuttering. I'm running Windows 7 x64. Anyone have any idea what's going on? Any help would be much appreciated.

    Read the article

  • Commerce Server 2009 with SharePoint 2010 experiences

    - by rsteckly
    Hi, I'm trying to decide between using MojoPortal for my organization's CMS or Commerce Server 2009 with SharePoint 2010. We already have SharePoint 2010 for our intranet, so perhaps it would make sense to deploy the same technology? We do not have a lot of traffic, but we do need basic e-commerce functionality. I haven't really found much documentation for Commerce Server 2009. It would have to share the same server as SharePoint 2010. I'm not worried about that because of the low traffic; I'm worried about how difficult it is to install. Is it a nightmare product to install, or is it pretty straightforward? Is it unrealistic for it to share a server with SharePoint 2010, even with relatively low traffic? Any experiences with administering MojoPortal? Thanks!

    Read the article

  • Export SSL Cert from IIS and import into GlassFish keystore

    - by Tim H
    What I need: I have an existing SSL certificate installed on IIS 6. On the same machine, I have GlassFish installed and would like to share the same certificate, since both use the same hostname but different ports: IIS uses 443 and GlassFish uses 8181. Why I need it: to reuse the existing SSL cert from IIS in GlassFish. I imagine this is possible; I am able to install an SSL cert into GlassFish's keystore and then import that exact same cert into IIS, so I just want to go the other way. Imagine having an SSL cert on IIS in use for months, and now wanting to enable SSL on GlassFish. What I have done: created a keystore with the alias server.hostname.com; imported the intermediate CA certs associated with the existing SSL cert; imported the existing SSL cert with the same alias server.hostname.com, but keytool won't allow this, as the cert is not associated with that key: keytool error: java.lang.Exception: Public keys in reply and keystore don't match. Why? Using a different alias causes the cert not to be trusted in the CA chain.
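
    A minimal sketch of the usual route in this direction, assuming the IIS certificate can be exported from the Windows certificate store as a .pfx (PKCS#12) file that includes its private key; the file names, passwords, and aliases below are placeholders:

        # hedged sketch: merge an exported IIS .pfx into a JKS keystore with keytool
        # (keytool ships with the JDK; -importkeystore copies the private key and chain)
        import subprocess

        subprocess.run([
            "keytool", "-importkeystore",
            "-srckeystore", "iis-export.pfx",   # placeholder: PFX exported from IIS
            "-srcstoretype", "PKCS12",
            "-srcstorepass", "pfx-password",    # placeholder
            "-srcalias", "iis-cert-alias",      # list the real alias with: keytool -list -storetype PKCS12 -keystore iis-export.pfx
            "-destkeystore", "keystore.jks",    # e.g. the GlassFish domain keystore (assumption)
            "-deststoretype", "JKS",
            "-deststorepass", "changeit",       # placeholder
            "-destalias", "server.hostname.com",
        ], check=True)

    GlassFish would then need its HTTPS listener pointed at that keystore and alias; the exact steps vary by GlassFish version.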

    Read the article

  • Sharing storage on Linux and Solaris

    - by devlearn
    I'm looking for a solution to share a SAN-mounted volume between several hosts running Linux (RHEL) and/or Solaris (SPARC). Note that I basically need to share a set of directories containing large binary files that are accessed in random read/write mode. I have the following requirements: keep the data on the SAN; suitable I/O performance, as the software is pretty demanding in terms of IOPS; stick to a shared file system, as I can't afford a cluster FS (I lack the MDS/OSS infrastructure); and compression could be really useful. For now I've found only the following candidates: GFS2, which supports Linux only, with no compression; and VxFS, which supports Linux and Solaris, with compression supported. If you have suggestions to add to this list, I'll really welcome them. Thanks in advance.

    Read the article

  • DRBD Primary/Primary + iSCSI: does accessing different files avoid split brain?

    - by Eddie C.
    I have a question / curiosity about split brain in a DRBD Primary/Primary configuration. Suppose two nodes (hosts), host1 and host2, are configured with DRBD Primary/Primary, and two different shares (NFS, CIFS or iSCSI) expose a replicated area (say /drbd) containing /drbd/file1.data and /drbd/file2.data. If one pool of clients only accessed file1.data, reading and writing through the host1 share, and another pool only accessed file2.data through the host2 share, would this scenario avoid a split-brain situation in case one node fails, or is that just a conjecture? The end goal is to load-balance between the two nodes under normal conditions and collapse onto a single node only in case of failure. Thank you! Eddie

    Read the article

  • Network printer - Print direct or via shared printer on Server?

    - by NickC
    It has occurred to me that a workstation can connect to a printer in two ways: 1) printing directly to the IP of the printer, with the print driver installed locally, or 2) printing to a \\Server\Printer1 share, with the print queue residing on the server. The question is: which way is preferred? I would assume that printing directly to a network printer, rather than going through the server, would be the most efficient in terms of network traffic. On the other hand, I guess a server printer share would be easier to manage, with the correct driver automatically being downloaded to the workstations. Also, what about using Group Policy Preferences (GPP, Server 2012) to install this printer on the workstations: does that require any specific approach?

    Read the article

  • How to copy symlinks to target as normal folders

    - by Marek
    Hi, I have a folder with symlinks: marek@marek$ ls -al /usr/share/solr/ total 36 drwxr-xr-x 5 root root 4096 2010-11-30 08:25 . drwxr-xr-x 358 root root 12288 2010-11-26 12:25 .. drwxr-xr-x 3 root root 4096 2010-11-24 14:29 admin lrwxrwxrwx 1 root root 14 2010-11-24 14:29 conf -> /etc/solr/conf I want to copy it to ~/solrTest, but I also want the files behind the symlinks copied as real files. When I try cp -r /usr/share/solr/ ~/solrTest, I still get the symlink there: marek@marek$ ls -al ~/solrTest total 36 drwxr-xr-x 5 root root 4096 2010-11-30 08:25 . drwxr-xr-x 358 root root 12288 2010-11-26 12:25 .. drwxr-xr-x 3 root root 4096 2010-11-24 14:29 admin lrwxrwxrwx 1 root root 14 2010-11-24 14:29 conf -> /etc/solr/conf
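
    For what it's worth, GNU cp can follow the links with cp -rL (--dereference); the same behaviour is easy to sketch in Python with shutil.copytree, which follows symlinks unless told otherwise:

        # small sketch: copy /usr/share/solr to ~/solrTest, replacing each symlink with
        # a real copy of whatever it points to (copytree follows links when symlinks=False,
        # which is the default); the destination must not exist yet
        import os
        import shutil

        src = "/usr/share/solr"
        dst = os.path.expanduser("~/solrTest")
        shutil.copytree(src, dst, symlinks=False)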

    Read the article

  • php.ini use multiple include paths - open_basedir restriction

    - by hfidgen
    I need to allow an include path for a vhost subdomain on Plesk 10. I've added the PHP PEAR path to /etc/php.ini, as I'm happy for it to be globally available: include_path = ".:/usr/share/pear/" This works insofar as PHP is able to see the files in that directory when a script tries to include them, but I'm getting the dreaded open_basedir error: Warning: require_once() [function.require-once]: open_basedir restriction in effect. File(/usr/share/pear/xxxx.php) is not within the allowed path(s): (/var/www/vhosts/xxxx.com/subdomains/test/httpdocs/:/tmp/) Am I right in saying that the subdomain or main domain can have a vhost.conf file in which I can alter the open_basedir allowed paths? I've tried searching for solutions but I'm afraid I can't quite see one yet :)
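
    A toy model of the check, not PHP's actual implementation, just to illustrate why /usr/share/pear has to appear in the vhost's open_basedir value and not only in include_path:

        # open_basedir is a colon-separated whitelist of path prefixes; include_path only
        # tells PHP where to look for includes, it does not make those locations allowed
        import os

        def allowed(path, open_basedir):
            real = os.path.realpath(path)
            prefixes = [p.rstrip("/") for p in open_basedir.split(":") if p]
            return any(real == p or real.startswith(p + "/") for p in prefixes)

        basedir = "/var/www/vhosts/xxxx.com/subdomains/test/httpdocs/:/tmp/"
        print(allowed("/usr/share/pear/xxxx.php", basedir))                        # False
        print(allowed("/usr/share/pear/xxxx.php", basedir + ":/usr/share/pear/"))  # True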

    Read the article

  • Homegroup doesn't show other computers, but "Network" does

    - by McPherrinM
    I have a desktop and a laptop (both running Windows 7) in my Windows 7 homegroup. The desktop created the homegroup, and the laptop joined it. Both share a few folders. On the laptop, I have no problem accessing the desktop's shared folders via the "Homegroup" sidebar button in Windows Explorer. However, on the desktop, the Homegroup screen shows the message "No other homegroup computers are currently available". Yet if I go to the Network page, I can see the other computer and browse its shared media. These shares were made by right-clicking and choosing "Share with Homegroup". I can access the media, so this isn't a big problem, but I'm just confused as to why the Homegroup screen denies the existence of the other computer. Has anybody else encountered and resolved this?

    Read the article

  • nginx: php-fastcgi running but php files not executing

    - by Daniel
    I have recently set up an nginx server with PHP running as a FastCGI process. The server serves HTML files fine; however, PHP files are downloaded instead of being executed, and the PHP code is not processed. This is what I have in nginx.conf: server { listen 80; server_name pubserver; location ~ \.php$ { root /usr/share/nginx/html; fastcgi_pass 127.0.0.1:9000; fastcgi_index index.php; fastcgi_param SCRIPT_FILENAME /usr/share/nginx/html$fastcgi_script_name; include fastcgi_params; } } The command netstat -tulpn | grep :9000 displays the following, which indicates php-fastcgi is running and listening on port 9000: tcp 0 0 127.0.0.1:9000 0.0.0.0:* LISTEN 2663/php-cgi If it's of any importance, my server is running CentOS 6 and I installed nginx and PHP using the repositories from The Fedora Project.

    Read the article

  • Getting "open_basedir restriction in effect" in spite of adding the correct entry.

    - by akshatc
    I am trying to create a shared hosting scenario using the open_basedir option of PHP. I am doing this by adding the following to apache2.conf: <VirtualHost *:80> ServerName lt1.example.net DocumentRoot /home/akshat/example/tmpblogs/tb1/ php_admin_value open_basedir /home/akshat/example/tmpblogs/tb1/ </VirtualHost> <VirtualHost *:80> ServerName lt2.example.net DocumentRoot /home/akshat/example/tmpblogs/tb2/ php_admin_value open_basedir /home/akshat/example/tmpblogs/tb2/ </VirtualHost> Now when I access lt2.example.net, I get the error: Warning: Unknown: open_basedir restriction in effect. File(/home/akshat/example/tmpblogs/tb2/index.php) is not within the allowed path(s): (0) in Unknown on line 0 Warning: Unknown: failed to open stream: Operation not permitted in Unknown on line 0 Fatal error: Unknown: Failed opening required '/home/akshat/example/tmpblogs/tb2/index.php' (include_path='.:/usr/share/php:/usr/share/pear') in Unknown on line 0 I was getting the same error while accessing lt1.example.net too, but then it suddenly became alright. What am I doing wrong here?

    Read the article

  • CentOS: safe to yum reinstall after removing 32-bit packages?

    - by virtualeyes
    As per the CentOS FAQ on removing 32-bit packages present in a 64-bit install, is it safe to perform the last step, "You may also want to do this: yum reinstall \* The reason is that sometimes the /usr/share/ items (shared between BOTH packages) get removed when removing the 32-bit RPM packages.", on an existing installation (i.e. one where the data and settings of possibly affected applications need to be preserved)? rpm -Va shows a number of entries like: /sbin/ethtool: at least one of file's dependencies has changed since prelinking S.?..... /sbin/ethtool /usr/libexec/mysqld: at least one of file's dependencies has changed since prelinking S.?..... /usr/libexec/mysqld along with /usr/share entries carrying the T flag (apparently just a file-time difference, which seems safe). The machine is up and running fine, but may not be whenever a reboot occurs. Any clue as to the real state of the machine (hosed or OK) is appreciated. Thanks

    Read the article

  • apc.stat causes 500 internal server error

    - by Legit
    When I turn off apc.stat, it causes a 500 internal server error. I checked the Apache error_log and it shows: [Tue Jun 26 10:02:59 2012] [error] [client 127.0.0.1] PHP Warning: require(): Filename cannot be empty in /var/www/site1/public/index.php on line 17 [Tue Jun 26 10:02:59 2012] [error] [client 127.0.0.1] PHP Fatal error: require(): Failed opening required '' (include_path='.:/usr/share/pear:/usr/share/php') in /var/www/site1/public/index.php on line 17 I checked that line and here's what it contains: require('./wp-blog-header.php'); I don't see anything wrong with it. Here's my current setup: APC version 3.1.10, PHP version 5.4.4. How do I resolve this error when I disable apc.stat?

    Read the article

  • Can I mark a folder as mountpoint-only?

    - by Collin
    I have a folder, ~/nas, on which I usually mount a network drive with sshfs. Today I didn't realize the share hadn't been mounted yet and copied some data into it. It took me a while to realize that I'd just copied the data onto my own local drive rather than onto the network share. Is there some way to mark in the system that this folder is supposed to be a mount point, and to not let anyone copy data into it? I tried the permissions solution here: How to only allow a program to write to a directory if it is mounted?, but if I don't have write access I also can't mount anything onto it.
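
    This doesn't answer the system-wide question, but for copies driven by scripts one controls, a minimal guard is to check os.path.ismount before writing (a sketch; the source path is hypothetical):

        # refuse to copy unless ~/nas is actually a mount point right now
        import os
        import shutil
        import sys

        dest = os.path.expanduser("~/nas")
        if not os.path.ismount(dest):
            sys.exit(dest + " is not mounted; refusing to copy")
        shutil.copy2("/path/to/some/file", dest)   # hypothetical source file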

    Read the article

  • Apache SSL losing session over load balancer

    - by SaltyNuts
    I have two physical Apache servers behind a load balancer. The load balancer was supposed to be set up so that a user would always be sent to the same physical server after the first request, to preserve sessions. This worked fine for our web apps until we added SSL to the setup. Now a user can log in successfully and see the home page, but clicking on any other internal link logs the user right out. I traced the issue to the fact that while the initial authentication is performed by server 1, clicking on internal links sends the request to server 2. Server 2 does not share sessions with server 1, and the user is kicked out. How can I fix it? Do I need to share sessions between the two servers? If so, could you point me to a good guide for doing this? Thanks.

    Read the article

  • Shared block device file system (cluster file system without networking)

    - by fungs
    Is there any file system for Linux that can be mounted multiple times and supports concurrent file access? Basically I want something like a cluster file system, but without the need for a running network for a distributed lock manager. That can be very handy with virtual machines that share data with the host or another VM without needing a network link. I want to avoid that link to keep the network architecture secure (the virtual machine sits in a DMZ) while still sharing large files. There is no need to scale it up; just two machines that mount the same block device. Shouldn't it be possible to keep the file-locking information right on the disk?

    Read the article

  • Microsoft Office "Read-only" warning not appearing on Samba shares on Mac OS X Server

    - by bongo
    Hi, some of my users don't get the "read-only" warning (though "read-only" does appear in the title bar and the document is indeed opened read-only) when opening Office 2007 documents already opened by another user. We run the Samba share off an Xsan volume under Mac OS X 10.5.8 Server. Strict locking is on but oplocks are off (set from Server Admin). At home, with a simple Samba share on 10.6.3 Server, it works correctly. Any ideas, or is this 10.5.8 behavior?

    Read the article

  • MBP Bluetooth PAN connection with iPhone 4

    - by Chetan Sachdev
    I am trying to share my MBP's (OS X 10.8.3) internet connection with an iPhone 4 (iOS 6) using Bluetooth PAN (Personal Area Network). The problem is that the Bluetooth PAN is not getting an IP address. I have tried renewing the DHCP lease, but nothing works. When I set a manual IP address, the Bluetooth PAN goes green but the connection doesn't show up on the iPhone. Is it possible to share the internet connection via Bluetooth? Note: I don't want to create an ad-hoc connection over Wi-Fi.

    Read the article

  • How to determine which source files are required for an Eclipse run configuration

    - by isme
    When writing code in an Eclipse project, I'm usually quite messy and undisciplined in how I create and organize my classes, at least in the early hacky and experimental stages. In particular, I create more than one class with a main method for testing different ideas that share most of the same classes. If I come up with something like a useful app, I can export it to a runnable jar so I can share it with friends. But this simply packs up the whole project, which can become several megabytes big if I'm relying on a large library such as HttpClient. Also, if I decide to refactor my lump of code into several projects once I work out what works, and I can't remember which source files are used in a particular run configuration, all I can do is copy the main class to a new project and then keep copying missing types until the new project compiles. Is there a way in Eclipse to determine which classes are actually used in a particular run configuration?
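
    Outside Eclipse, one rough way to get the transitive class-level dependencies from the compiled output is the JDK's jdeps tool (JDK 8 and later); a small wrapper sketch, assuming the project's class files live in a bin directory:

        # print class-level dependencies for everything under bin/, recursing into
        # the classes they reference; jdeps ships with JDK 8+
        import subprocess

        result = subprocess.run(
            ["jdeps", "-verbose:class", "-recursive", "bin"],
            capture_output=True, text=True, check=True,
        )
        print(result.stdout)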

    Read the article

  • How do you sync photos with your family?

    - by romkyns
    We've always been trying to share photos within our family, despite all living in different countries. This has been very challenging. We have about 50 GB of photos that we share with each other. Everyone organizes them differently, so rsync/Syncplicity don't work. Everyone likes their photos on a fast hard drive rather than on a slow website with reduced quality, so online sharing websites are a no-go. So far we have essentially been syncing them manually, via a shared folder that new photos are placed in and collected from. This is laborious and prone to errors that usually leave one of us without all the photos, without even knowing it. To those in a similar situation: how do you solve this?

    Read the article

  • VMware ESXi 4 On-Disk Data Deduplication - possible and supported?

    - by hurikhan77
    Environment: we are running multiple web, database, and application servers that usually share a fairly common installation (Gentoo Linux) and similar configuration on VMware ESXi 4. The differences are usually only a few installed features or differing component versions. To create a new server, I usually choose the most similar (by features) running server, rsync a copy of it into freshly mounted filesystems, run grub, reconfigure, and reboot. Problem: over time this duplicates many on-disk data blocks, which probably adds up to several tens of gigabytes. I suppose that if I could use a base system as a template, with the actual machines layered on top of it and only changed blocks written to some sort of "diff image", performance should improve (higher cache hit rate) and storage efficiency should increase (deduplicated storage space). This would be similar to what ESXi already supports for RAM deduplication (page sharing). Question: is there any way to easily do this on ESXi 4? I already share the portage tree via NFS, but this would not work for the rootfs.

    Read the article

  • How to stream audio and video files, but use any media player on Windows (without using Windows file sharing)

    - by RamyenHead
    I want to access and play media files on machine S (Windows XP) from machine C (Windows XP). If Windows file sharing ("share this folder" stuff) worked, I would share the folder containing the media files on machine S and could then play them while sitting in front of C, using any media player I want; Windows ensures that the remote files behave like local files. But Windows file sharing won't work for me; is there any alternative? If the two machines were both Linux, I would install an SSH server on S and use Nautilus from C to access and play the media files. The reason I can't use Windows file sharing is that my campus uses two different subnets, S and C are on different subnets, and the firewall governing the whole campus network doesn't seem to allow file sharing between subnets. I tried changing the Windows Firewall settings on S to allow C in, but it still wouldn't work, so it must be the other firewall.
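
    One possible alternative, sketched under the assumptions that Python is available on S, that the campus firewall allows the chosen port between the subnets, and that the player on C (VLC, for example) can open plain HTTP URLs; the folder and port below are placeholders:

        # serve the media folder on S read-only over HTTP; on C, open something like
        # http://<address of S>:8080/somefile.avi in a player that accepts HTTP URLs
        import http.server
        import os
        import socketserver

        os.chdir(r"D:\Media")    # hypothetical media folder on S
        PORT = 8080              # placeholder; pick a port the firewall permits

        httpd = socketserver.TCPServer(("", PORT), http.server.SimpleHTTPRequestHandler)
        httpd.serve_forever()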

    Read the article

  • SABnzbd Installed on Linux NAS

    - by Mike Szp.
    I installed SABnzbd on a Linux-based NAS. The directory it downloads to is mapped differently on the NAS itself, because the path that SABnzbd knows about starts in its own folder. If this sounds confusing, let me give you an example: \\MYNAS\Volume_1\ is the path of the drive on the NAS. I would like my SABnzbd downloads to go to \\MYNAS\Volume_1\Downloads. Right now SABnzbd is installed to \\MYNAS\Volume_1\ffp\opt\optware\share\SABnzbd, and the default download directory (as shown in SABnzbd) is /ffp/opt/optware/share/SABnzbd/downloads/complete. I know that the mapping is different somehow because it is installed on the NAS, but I am just lost as to what I should do. So far, for the complete folder I have tried: /192.168.restofip/Volume_1/downloads/complete /Volumes/Volume_1/downloads/complete /Volume_1/downloads/complete Does anyone know how to change the path so that it downloads to one of the topmost folders on the NAS instead of a folder so deep in the drive?

    Read the article
