Search Results

Search found 12826 results on 514 pages for 'third party controls'.

  • Outlook 2007 panes keep moving when changing resolution

    - by SilverbackNet
    This problem has really been bugging one of our users ever since he got a larger monitor. Now that the monitor has a different resolution than his laptop's, every time he unplugs it to go home, the three Outlook panes get all jumbled up: the navigation pane is huge, the list is shoved over, and the reading pane is almost smushed out of existence; then the opposite happens when he comes back in and the reading pane fills the screen. He's sick of adjusting it every day. He always runs it maximized, for maximum reading area, so keeping the application within a 1024x768 window wouldn't really be an option for him. Is there any way built into Outlook to automatically adjust pane sizes when the resolution changes? If not, is there a third-party app that can help, or a way to script the changes into the registry somehow? (I can handle running the script whenever the screen state changes.) If this is fixed in 2010, I might be able to convince the other admin that this is a good enough reason to allow it (which will require a new beta version of our archiving software).

  • Sharing music on NAS with Zune and iPod?

    - by osij2is
    After being a long-time iPod owner, I'm switching to the new Zune with its subscription model. I haven't bought a Zune yet, but I'm planning on doing so within the next month or so. I have approximately 40GB worth of music, and my girlfriend's iPod music library is around 30GB. I've been trying to figure out how to migrate all our music off of our laptops/desktops and centralize everything on my NAS. Sharing iPod music isn't too bad: sharing from one machine to all is fairly easy within the iTunes player. As far as storing all the music on a NAS, again, iPods aren't too bad, and I imagine other systems aren't difficult either. But I'm really new to the Zune, and I'm beginning to run into some issues. My questions are: Is it possible to store all music from our iPods and Zune subscriptions and share music between the iPod/Zune within the same file share on my NAS? I'm sure it's possible to store music on a share, but I'm not sure how iTunes and the Zune software differ. Is there 3rd-party software, maybe something like DoubleTwist, that can sync from the NAS to multiple desktops/laptops? I've never used DoubleTwist, but it's something I found that looks close to being what I need. I've never quite done this myself, so I'm trying to find a solution that can: a) store music on a network share; b) sync between different devices (Zune/iPod) seamlessly.

  • Recommended drive encryption solution

    - by Chris Driver
    Hello, I will soon be purchasing a number of laptops running Windows 7 for our mobile staff. Due to the nature of our business I will need drive encryption. Windows BitLocker seems the obvious choice, but it looks like I need to purchase either the Enterprise or Ultimate edition of Windows 7 to get it. Can anyone offer suggestions on the best course of action: a) use BitLocker, bite the bullet and pay to upgrade to Enterprise/Ultimate; b) pay for another 3rd-party drive encryption product that is cheaper (suggestions appreciated); c) use a free drive encryption product such as TrueCrypt. Ideally I am also interested in 'real world' experience from people who are using drive encryption software, and any pitfalls to look out for. Many thanks in advance... UPDATE: Decided to go with TrueCrypt for the following reasons: a) the product has a good track record; b) I am not managing a large quantity of laptops, so integration with Active Directory, management consoles, etc. is not a huge benefit; c) although eks did make a good point about Evil Maid (EM) attacks, our data is not desirable enough to consider them a major factor; d) the cost (free) is a big plus but not the primary motivator. The next problem I face is that imaging tools (Acronis/Ghost/...) will not work on encrypted drives unless I perform sector-by-sector imaging. That means an 80GB encrypted partition creates an 80GB image file :(

  • Restoring a fresh home folder in a shared user domain environment

    - by Cocoabean
    I am using a tool called pGINA that adds another credential provider to my Windows 7 clients so we can authenticate campus users via campus LDAP. We have the default Windows credential providers set up to authenticate off of our Active Directory, but we have students in our classes that don't have entries in our AD, and we need to know who they are to allow them internet access. Once these LDAP users log in using pGINA, they are all redirected to the same AD account, a 'kiosk' account with GPOs in place to prevent anything malicious. My concern is that my users will accidentally save personal login information or files in that shared profile, and another user may log in later and have access to a previous user's Gmail account, as the AppData folder on each computer is shared by anyone logging into the kiosk user. I've looked into MS's 'roll-your-own' SteadyState but it didn't seem to have what I wanted. I tried to write a PS script to copy a pre-saved clean version of the profile from a network share, but I just kept running into issues with CredSSP delegation and accessing the share from the UNC path. Others have recommended something like DeepFreeze, but I'd like to do it without 3rd-party tools if possible.
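
    For reference, the shape of the script I was attempting looks roughly like this (just a sketch; the share path, service account, and profile location are placeholders I made up):

        # Map the share with explicit credentials instead of relying on delegation
        # (sidesteps the CredSSP double-hop that bites plain UNC access).
        net use Z: \\server\profiles$\kiosk-clean PlaceholderPassword /user:CAMPUS\svc-profiles
        # Mirror the clean profile over the kiosk profile; must run while the
        # kiosk user is logged off. /XJ skips junction points inside the profile.
        robocopy Z:\ C:\Users\kiosk /MIR /XJ /R:1 /W:1
        net use Z: /delete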

  • Starting old computer - nothing shown on screen at boot

    - by Jonas
    I'm trying to start a PC that's about 10 years old, but nothing is shown on the screen, and it beeps every time I press a key on the keyboard. I can press Ctrl+Alt+Del to reboot the computer. The monitor is newer and seems to work with other computers. I don't see anything from the POST/BIOS at startup or later. I have tried changing to another graphics card, but it didn't change anything. What can I do to solve this problem? Update: I have now tried with another computer (the one the "other graphics card" came from) and I got the same problem: it doesn't show anything on the screen. Both these computers had a GeForce2 MX 400 graphics card. I tried with another monitor and it didn't work either; it showed "signal out of range". So is the GeForce2 MX 400 graphics card too old for newer TFT monitors? I tried with a third computer, so I know that the monitors are working; both monitors work fine with that computer.

  • Disable internal display on Macbook Pro without closed lid mode?

    - by jslaker
    I have an early 2007 MacBook Pro running 10.5 that I've recently set up on a KVM with my primary desktop system. The problem I've run into is that I have a 20" 1680x1050 LCD, and OS X only provides options to mirror at the resolution of the built-in display or to span. Since the built-in display runs at 1440x900, this means running my LCD at a non-native resolution and a fuzzy picture. There isn't any option that I can find to simply disable the built-in display entirely and run the external LCD at its native resolution. I am aware of closed-lid mode, but the MBP was disassembled while in storage for about 6 months (took it apart to pull the HDD) and the cable to the touchpad, which controls the sleep sensor, was damaged, meaning closed-lid mode won't work. I've looked into replacing the cable, but the cheapest I've been able to find it is $75-100, and I'm trying not to invest any more money into this computer, as it also has a completely dead battery and a few other minor problems. I've found the app SwitchResX, which appears to allow you to do what I need, but it has a lot of functionality I don't need and a ~$20 registration charge attached to it. An odd set of circumstances, I'm aware, but I was hoping somebody might know of an OS hack that would let me just disable the internal display and be done with it. :)

  • NAT via iptables and virtual interface

    - by Alex
    I'm trying to implement the following scenario: one VM host, multiple guest VMs, each one getting its own IP address (and domain). Our server has only one physical interface, so the intended use is to add virtual interfaces on eth0. To complicate our situation, the provider uses port security on their switches, so I can't run the guest interfaces in bridged mode, because then the switch detects a "spoofed" MAC address and kills the interface (permanently, forcing me to call support, which I'm sure will get them a little bit angry the third time ;) ). My first guess was to use iptables and NAT to forward all packets from one virtual interface to another one, but iptables doesn't seem to like virtual interfaces (at least I can't get it to work properly). So my second guess is to use the source IP of the packets to the public interface. Let's assume libvirt creates a virbr0 network with 192.168.100.0/24 and the guest uses 192.168.100.2 as its IP address. This is what I tried to use:

        iptables -t nat -I PREROUTING --src public_ip_on_eth0:0 -p tcp --dport 80 -j DNAT --to-destination 192.168.100.2:80

    That doesn't give me the intended results either (accessing the server times out). Is there a way to do what I'm trying to do, or even to route all traffic to a certain IP on a virtual interface to the VM's device?
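
    For comparison, a variant I still need to test (a sketch; 203.0.113.10 stands in for the public IP bound to eth0:0) matches on the destination address instead, since incoming requests carry the public IP as their destination, not their source:

        # allow the kernel to forward between eth0 and virbr0
        echo 1 > /proc/sys/net/ipv4/ip_forward
        # rewrite the destination of incoming HTTP to the guest
        iptables -t nat -A PREROUTING -d 203.0.113.10 -p tcp --dport 80 -j DNAT --to-destination 192.168.100.2:80
        # let the forwarded traffic through the FORWARD chain
        iptables -A FORWARD -d 192.168.100.2 -p tcp --dport 80 -j ACCEPT
        # replies from the guest leave with the host's address
        iptables -t nat -A POSTROUTING -s 192.168.100.0/24 -o eth0 -j MASQUERADE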

  • Is there any way to synchronize AD users with Office 365 but still be able to edit them online?

    - by Massimo
    I'm performing a migration to Office 365 from a third-party mail server (MDaemon); the local Active Directory doesn't include any Exchange server, and never had any. We will need directory synchronization in order to enable users to log on to Office 365 using their domain credentials; but it seems that as soon as you enable directory synchronization, you can't perform any action anymore on Office 365 users: all changes need to be made on the local Active Directory, and then replicated by the synchronization process. For ordinary users with a single e-mail address and standard features, this is not a big problem; but what about users which need an additional address? What if I need to configure some nonstandard setting, like "hide from address list" or a custom mailbox quota? From what I've gathered, the only supported way to do this, as you can't directly edit Office 365 objects anymore after synchronization is enabled, is to extend the local AD schema with Exchange attributes, and then manually edit them (!). Or, you can install at least one local Exchange server, and then use the Exchange administrative tools to configure the required settings. Is this correct or am I missing something? Is there any way to synchronize user accounts and password, but still be able to edit user settings directly in Office 365? If not (everything really needs to be set locally and then synchronized), is there any simpler way to do this than manually editing LDAP attributes or installing a local Exchange server?
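
    If extending the AD schema really is the only supported route, I assume the day-to-day editing would look something like this (a sketch; jdoe and example.com are made-up stand-ins, and it presumes the Exchange schema extensions are in place):

        Import-Module ActiveDirectory
        # proxyAddresses is multi-valued; an uppercase SMTP: prefix marks the primary address
        Set-ADUser jdoe -Add @{proxyAddresses = 'smtp:john.doe@example.com'}
        # the "hide from address list" setting is a plain boolean attribute
        Set-ADUser jdoe -Replace @{msExchHideFromAddressLists = $true}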

  • How to do a Windows 7 Image restore to an external drive?

    - by Vaccano
    I have a system that I have made a Windows 7 image backup of. I would like to migrate that image to a different hard drive. Is there a way to restore the image to an externally connected hard drive? For example: I have 3 hard drives: the first in the source machine (the one I want to copy), the second in a machine that I want to do the work, and the third not in a machine; it is the target that I want to overwrite with the contents of the first. I boot up the 2nd machine and connect the 3rd hard drive externally (using some cool cables I have). I then use some cool feature of Windows 7 to replace what is on the 3rd hard drive with the Windows 7 image of my 1st machine (which is on my networked backup server). I need to know what the above-mentioned "cool feature of Windows 7" is, if there is one, and how to use it. Any ideas? Note: I very much don't want it to overwrite what is on the 2nd machine's hard drive.

  • Why is only one wget command working in my crontab?

    - by afEkenholm
    I wish to fetch content from a PHP script on my server two times a day, altering a query variable lang to set which language we want, and save this content in two language-specific files. This is my crontab:

        */15 * * * * ~root/apache.sh > /var/log/checkapache.log
        10 0 * * * wget -O /path/to/file-sv.sql "http://mydomain.com/path/?lang=sv"
        11 0 * * * wget -O /path/to/file-en.sql "http://mydomain.com/path/?lang=en"

    The problem is that only the first wget command line is being executed (or to be precise: the only file that is being written is /path/to/file-sv.sql). If I switch the second and the third row, /path/to/file-en.sql gets written instead. The first line always runs as expected, no matter where it is. I then tried using lynx -dump "http://mydomain.com/path/?lang=xx" > /path/to/file-xx.sql, to no avail; still only the first lynx line executed successfully. Even mixing wget and lynx did not change this! Getting kinda desperate! Am I missing something? There are thousands of articles on crontab combined with wget or lynx, but all seem to cover basic setups and syntax. Does anyone have a clue what I am doing wrong? Thanks, Alexander
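
    In case it matters, the next variant I plan to try (a sketch; the paths are mine) uses absolute paths and captures errors, since cron runs with a minimal PATH, and on some crons the last line of a crontab is silently ignored if the file lacks a trailing newline:

        10 0 * * * /usr/bin/wget -O /path/to/file-sv.sql "http://mydomain.com/path/?lang=sv" >> /var/log/fetch-sv.log 2>&1
        11 0 * * * /usr/bin/wget -O /path/to/file-en.sql "http://mydomain.com/path/?lang=en" >> /var/log/fetch-en.log 2>&1
        # (make sure the crontab file ends with a newline after the last entry)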

  • Google Chrome Browser

    - by Harish
    Hi friends. I'm using Google Chrome as my default web browser, and I don't have any problem with it in general. The only problem arises when I go to gmail.com and log in to my account. I need to go to the history settings in Google Chrome (Ctrl+Shift+Del) and select "Delete cookies and other data" before I can log in to Gmail again. My Gmail page works just once: I log in, check my mail, and then I have to clear the cookies in order to log in again. If I fail to, this is the info I get: The webpage at https://mail.google.com/mail/?shva=1&ui=html&zy=l&pli=1&auth=DQAAALgAAABhdI_K9uptgb6yQfGVmnl74VZEUH7U2M7WGJn3kJnCiY0CNI5QBU3X-g6UjPENGoHKSHE9nRna_Ygu_d59mN-HG1SUzNpI_UEMJ9CwDqZAYxYLEJl8r_JA2qJNGF8H0fdKfn99Gb2YeI-lprGxCrWRT7LicyADxQvNLQ6l9xBvOccEBSJfdIrna8dOXeX06N41L0zpnLQrVG1qdulR7LxId9XwtVb6QtfhwnambqLoNiY402Y5pjGG1_gFL4dNpJA&gausr=hariss89%40gmail.com has resulted in too many redirects. Clearing your cookies for this site or allowing third-party cookies may fix the problem. If not, it is possibly a server configuration issue and not a problem with your computer. Here are some suggestions: Reload this web page later. Learn more about this problem. What can I do ...

  • How to use an OS X calendar as a better time-tracking solution for 5-10 users in a small agency?

    - by jnthnclrk
    I really like OS X's iCal. Entering events is easy with the mouse, and it also gives you a very real visual sense of how long tasks take to complete. We often work remotely in our organisation, so we use a few shared calendars between key individuals to provide us with an overview of hours worked, availability & schedule conflicts without too much disruption to our various, hectic workflows. It really is a neat solution, especially on shared tasks. How many times have you tasked a remote colleague and then lost the thread on whether that task was completed or not? With shared calendars you get a much clearer idea of what your people are working on without having to pick up the phone or compose a chat. However, there are a few areas where this approach fails:
    - iCloud syncing often needs to be re-jiggered
    - The "view only" option on shared calendars does not seem to work, which makes all shared calendars editable by others
    - There is no decent reporting with this workflow
    - There is no task categorisation or tagging
    - Things get very busy in iCal when working with more than 2 shared calendars
    I've looked at a few task management apps like Basecamp and Harvest, but nothing appears to let me edit my calendar natively and then sync with a 3rd party. I'm interested in solutions to improve the above workflow and enable us to elegantly increase the number of users.

  • MySQL server high traffic makes websites really slow or unable to load

    - by Holapress
    Lately we have been having a lot of problems with our MySQL server: websites are really slow or even unable to load at all. The server is a dedicated server that only runs our MySQL database. I have been running some tests using a profiler (JetProfiler) and a stress-testing tool (loadUI). If I use loadUI to connect with 50 simultaneous connections to one of our websites that runs a fairly big query, the website already becomes unable to load. One of the things that worries me is that JetProfiler always shows a Threads_connected of 1.00, and it seems that when it hits around 2.00 I'm unable to connect. The 3 big peaks are when I run a test with loadUI: the first was 15 simultaneous connections, which still let me load the website, just really slowly; the second was 40 simultaneous connections, which already made it impossible to load; and the third was 100 connections, which also didn't load anymore. Another thing that worries me is that JetProfiler says all the queries that get used are full table scans; could this maybe be the problem? The website I run as a test runs 3 queries: one for a menu that outputs around 1000 rows, one for the ads that has around 560 rows, and a big one to get the posts that has around 7000 rows (see screenshot below). I have also monitored the CPU of the server, and there seems to be no problem there; even when I make a lot of connections with loadUI, the CPU stays low. I can't seem to figure out the main cause of the websites being unable to load when there is a high amount of traffic. If anyone has suggestions for testing, or ideas about what might cause the problem, please let me know.
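
    (For the full-table-scan angle, this is how I've been checking individual queries; the table and column names here are invented stand-ins for ours:)

        -- type=ALL in the EXPLAIN output means a full table scan
        EXPLAIN SELECT * FROM posts WHERE category_id = 3 ORDER BY published_at DESC LIMIT 20;

        -- an index on the filtered and sorted columns should turn the scan into a ref/range lookup
        CREATE INDEX idx_posts_category_published ON posts (category_id, published_at);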

  • Strategy for incremental datasource fetching in Excel

    - by user1352530
    I am in a scenario with a table that is refreshed by a third-party app every week. I need to keep accumulating all the data in Excel, using an ODBC connection to the database. I am wondering: Approach 1: Is there a way to force Excel to append results on every update (this update would be triggered according to a parameter that indicates the week)? I tried to define the table that the connection loads into using a dynamic reference, but once it is anchored the first time, the table position is never redefined. Approach 2: Use an ETL to accumulate all weekly results into a staging table and then connect Excel to it in real time. But then I would need a mechanism for caching old data, as I cannot let Excel's opening time keep growing: imagine after 10 years, Excel would need to fetch 10 years of data at opening before showing anything. Is there a way to store already-fetched data and increment it in real time (when the workbook is opened) by selecting only the new data (with a query/filter or something)? Thanks. EDIT: Maybe it's better to ask it this way: what is the optimal strategy for a table that keeps growing and needs to be read in real time by Excel? I just don't want to fetch absolutely all the data after some months...
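
    (For approach 2, the staging table I have in mind would be keyed by week, something like this; all names are invented:)

        -- accumulates one snapshot of the source table per week
        CREATE TABLE weekly_history (
            week_id  INT NOT NULL,   -- e.g. 201432 for week 32 of 2014
            item_id  INT NOT NULL,
            qty      INT,
            PRIMARY KEY (week_id, item_id)
        );

        -- run once after each weekly refresh of the source table
        INSERT INTO weekly_history (week_id, item_id, qty)
        SELECT 201432, item_id, qty FROM source_table;

        -- Excel then only needs to fetch weeks it has not already cached
        SELECT * FROM weekly_history WHERE week_id > ?;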

  • Hybrid Exchange Online setup with on premise public folders, certificate issues?

    - by exxoid
    We have a Hybrid Exchange setup with Exchange Online (v15 tenant) and Exchange 2010 on premise. The hybrid configuration for the most part is working; what I am having an issue with is getting public folders to work for cloud users. I followed the official documentation here (http://technet.microsoft.com/en-us/library/dn249373(v=exchg.150).aspx) and it kind of works. When I am accessing Outlook on a public wifi connection, I am able to bring up the cloud mailboxes, and the on-premise public folders show up in Outlook. When I am accessing email via Outlook as a cloud user on the same LAN as the on-premise Exchange, the cloud user makes the outlook.com connection for the live/AD/archive mailbox but fails to create a proxy connection for the on-premise public folders. The error I get is a certificate mismatch; it seems that when a user on the LAN accesses Outlook/Exchange, it is using a different certificate than when Outlook is launched on a WiFi network. When I look at the Outlook connection information, I see the connection to outlook.com for the AD/live/archive mailbox but no entry for a public folder connection. Our on-premise Exchange is 2010 SP3 with the latest CUs. The client is a domain-joined laptop with Windows 7 and Office 2010 SP2, with the latest Windows updates applied. Our infrastructure has a working ADFS 3 and DirSync setup for Office 365. My question then is: what do I need to do to make sure that the cloud user launching Outlook on the LAN uses the proper certificate (the wildcard 3rd-party cert, vs. the self-signed certificate which it looks like it may be using during the connection attempt)?
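
    (One thing still on my list to test, on the assumption that the mismatch is in the Outlook Anywhere certificate principal name; the wildcard domain below is a placeholder for ours:)

        # on the on-premise Exchange 2010, pin the certificate principal name Outlook should expect
        Set-OutlookProvider EXPR -CertPrincipalName "msstd:*.example.com"
        # then check what Autodiscover is actually handing out
        Get-OutlookProvider | Format-List Name,CertPrincipalName,Server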

  • Understanding Unix Permissions (w/ ACL)

    - by Dr. DOT
    I am trying to set permissions on my server properly. Currently I have a number of directories and files chmod'd at 0777, but I am not comfortable with them being this way. So at the advice of a serverfault specialist, I had my hosting provider install ACL support on my shared virtual server. When I FTP to the server as my FTP user account "abc", I can do everything I need to do (and rightfully so), because all my dirs and files are owned by "abc", the group is "abc", and the 1st octal digit is set to 7 (rwx). That much I get. But here's where it gets dark gray for me. PHP runs as user "nobody", so when someone browses one of my web pages that either ends in .php or has some embedded PHP, I assume the last octal digit controls the access. Because all my dirs and files are owned by "abc" and assigned to group "abc", if the last digit were a 4 (r--) the server would let the browser read the file, and if it were a 6 (rw-) the server would also let the browser write to the file or directory, correct? What if the web document does not end in .php or does not have any PHP embedded; what is the user then? And how can I use ACLs to avoid setting the permission to 6 (rw-) or even 7 (rwx)? [not sure what execute does or means] I'm just looking for some sort of policy settings to best lock down my dirs and files while allowing my PHP scripts to do uploads and write to files (so my users don't call to tell me "permission denied"). Ok, thanks to anyone out there willing to lend me a hand. It is greatly appreciated.
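
    (From what I've read so far, I'm guessing the ACL side would look something like this, with /var/www/site/uploads as a made-up example directory; it needs the partition mounted with the acl option, and on a directory, x just means permission to enter it:)

        # give the PHP user write access to the upload directory only
        setfacl -m u:nobody:rwx /var/www/site/uploads
        # default ACL, so files PHP creates inside inherit the same access
        setfacl -d -m u:nobody:rwx /var/www/site/uploads
        # everything else can then stay 0755/0644, owned by abc; verify with:
        getfacl /var/www/site/uploads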

  • What to do about old MacBook

    - by John
    I have a white MacBook running 10.5.8. I accidentally dropped it, and about an inch of the right side of the screen is blacked out; I can't see the clock and the Trash can because of this. I checked how much it would cost to repair, and the cost of repairing is a flat rate of $300. I thought that was kind of expensive, like a third of the cost of the MacBook itself. Accidentally dropping a notebook is something that will happen sooner or later, so I think getting a Lenovo is the best option. I ended up getting a 15-inch MacBook Pro, but do you think it's worth the money to fix my old MacBook? I even saw some used MacBooks available on eBay for about $400. Anyway, I don't know what to do with it. I do like its screen and keyboard better than the MacBook Pro's, because the screen is less glarey and the keyboard is more crisp and fun to type on, and it's more wieldy when I'm using it while lying down.

  • Connecting to a subdomain severs the connection to the domain itself. What's going on?

    - by TheAgent
    Hi all. We have a website on a third-party server (leased, and shared with other websites), and the server provides access to our SQL Server database through a subdomain in the form of mssql.DomainName.com. I was told to use SQL Server Management Studio Express to connect to this subdomain in order to manage the database. After a few tries and many "Timeout" messages, I finally managed to connect to the server; everything was fine. But now I can't connect to DomainName.com anymore. When I try to browse DomainName.com using Firefox, it tries to look up the DomainName.com address and fails, telling me "the server was not found". I have to disconnect Management Studio from the server and wait a couple of hours for DomainName.com to become available again, and after that, trying to reconnect to the SQL Server repeats the scenario. While I can't browse DomainName.com directly, I can use a proxy to connect to it, meaning that the problem is somehow related to the DNS server my computer asks to translate the name to the corresponding IP. Has anyone seen anything like this before? Any ideas? Thanks in advance.
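
    (Next time it happens I intend to compare resolvers while the name is "gone"; a quick sketch of the check, with 8.8.8.8 just as an example of a public resolver:)

        rem ask the default (ISP/local) resolver
        nslookup DomainName.com
        rem ask a public resolver for comparison
        nslookup DomainName.com 8.8.8.8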

  • Managing SharePoint permissions via Active Directory?

    - by rgmatthes
    My company has thousands of employees organized thoroughly via Active Directory. I have confidence in the accuracy of the Department and Title information displayed in the user profiles. I'm helping to put up a brand new SharePoint 2007 site, and I contacted IT about managing the site's permissions through AD Groups. The goal is to have the site automatically assign read/write/contribute/whatever permissions based on the information in AD. For example, we could create an AD Group called "Managers" that would contain anyone with the "Manager" title in their AD user profile. I would have SharePoint tap into this AD Group to mass assign permissions if I knew all managers would need a certain level of access (read/write/contribute/whatever). Then if a manager joins the company or leaves it, the group is automatically updated (provided AD gets updated, of course). My IT rep called back and said it couldn't be done. This seems like a pretty straightforward business requirement, and one of the huge benefits of having Active Directory, but maybe I'm mistaken. Could anyone shed some light on this? A) Is it possible to use dynamically-updated AD Groups when assigning permissions via SharePoint? (Does anyone know of a guide I could show my doubtful IT rep?) B) Is there a "best practice" way to go about this? I've read some debate on whether SharePoint Groups or AD Groups are the way to go. My main concern is dynamic updating. C) If this isn't available out of the box, can someone recommend third-party software that will provide the functionality I'm looking for? A big thanks to anyone who can help me out!!

  • Is there a way to log commands that a user runs in Windows 7?

    - by camster342
    I manage a large enterprise environment, and while we try to advise users not to, there are inevitably users that need to have local admin access to their machines. The problem is that some of these users like to "fiddle" and sometimes screw up their machines in "wonderful" ways. Is there an easy way to log what a user does on a machine, specifically in the command prompt? Maybe there are 3rd-party tools I could use to log this information? With Linux, which I used in ages past, you could look at a user's bash history file to see what commands they had run. While I realise that specific log could also be altered by the user if they wanted to cover their tracks, that is the sort of log I'm looking for. If there are other ways I can also log other system-configuration-type changes they make (not necessarily command-line based), that's also useful. I know about the event/system logs and so on, but they don't necessarily catch all the information I need to figure out how the user has buggered their machine this time.
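
    (The closest built-in thing I've found so far is process-creation auditing, which logs programs started rather than a true shell history; a sketch of what I'd enable and how I'd read it back:)

        rem enable auditing of process creation (logged as Event ID 4688 in the Security log)
        auditpol /set /subcategory:"Process Creation" /success:enable
        rem dump the 20 most recent process-creation events as text, newest first
        wevtutil qe Security /q:"*[System[(EventID=4688)]]" /c:20 /rd:true /f:text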

  • Ubuntu: Memory Leak

    - by Keener
    I'm having trouble finding where this memory leak is occurring. I'm running Ubuntu 8.04 LTS on a Dell XPS M1530. I have 3GB of RAM, and I'm finding that after about an hour or so of use, top shows 2GB+ used. The strange thing is that when I add up the memory percentages by PID, either from top or from ps aux, I find that I should only be using about 20-25% of my available RAM. What brought this to my attention is that I've begun running VMware Server again. Obviously the RAM usage spikes when I load a virtual machine, but the memory VMware is using does not account for the memory usage I'm seeing via top or free. Stopping VMware Server releases the memory which was allocated to it, but I'm still unable to find where this RAM is being used. After a complete reboot, of course, the memory is fine, but very quickly it climbs to 60-80% usage, with the processes only appearing to account for a third of that. Any ideas where I should look for more information on what this could be?
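
    (For reference, this is how I've been adding the numbers up; a sketch:)

        # total/used/free; the -/+ buffers/cache row shows usage with the kernel's
        # filesystem cache excluded, since Linux counts cache as "used"
        free -m
        # top memory consumers by resident size
        ps aux --sort=-rss | head -n 15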

  • OS X Application icons missing after filesystem tampering

    - by dylan
    A while back I installed some Google software; I forget what it was, but it was something that attached itself to Google Chrome and was un-uninstallable short of uninstalling all of Chrome. It was a total pain in the a$$. Anyway, I was trying to put up a fight before going nuclear and just wiping Chrome, and during that fight something went wrong (i.e., I didn't know what I was doing). I remember messing around with the Applications folder somehow. I don't remember exactly what I did, but it perhaps involved some interchanging between the /Applications and /Users/Username/Applications folders? Definitely some tampering in those areas, I'm sure at least about that. Anyway, I've since updated and restarted my computer. Now, while all my apps work fine, many of them have blank icons: not in the Dock, those icons are fine, but in Spotlight search results, etc. Currently, my /Applications folder contains only OS X built-in applications. My ~/Applications folder contains built-in apps AND third-party apps. There hasn't been any functional deficit so far, but I don't like working on a machine where something isn't how it should be. I know my information is vague, but does anyone know what went wrong / how to fix it? OS X Mavericks 10.9.3
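
    (One thing on my to-try list, on the assumption that Spotlight's icons come from the LaunchServices database and that it is what got confused; rebuilding it is supposedly harmless, but I haven't verified that:)

        # re-scan the standard application domains and rebuild the LaunchServices database
        /System/Library/Frameworks/CoreServices.framework/Frameworks/LaunchServices.framework/Support/lsregister \
            -kill -r -domain local -domain system -domain user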

  • Synchronize folder on network, preserving hard links

    - by Waleed Hamra
    I have a few computers running Windows XP Pro. I want to synchronize/back up a folder from one machine to another. So far it's been a simple problem, and I've used FreeFileSync for such operations with very satisfactory results. But this all changes when hard links come into play. Today's folder contains lots of hard links, and using such backup programs results in hard links being treated as multiple files and copied as such, greatly increasing the folder size on the destination and defeating the purpose of using all these hard links in the first place. It gets more complicated when we take into consideration the fact that network shares on Windows DON'T expose hard-linking facilities, meaning that running a hard-link-aware tool like rsync with --hard-links will be of no use. So my question: how can I back up my folder to the other computer while preserving hard links? I don't mind installing 3rd-party tools to do it, as obviously the standard Windows shares approach won't work... I am guessing there might be some tool that can be installed on both machines and work in a server/client mode? Anyone have an idea how to do this?
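
    (The server/client shape I'm picturing, sketched with a Windows rsync port such as cwRsync in mind; the module name and paths are invented:)

        # rsyncd.conf on the destination machine (which runs rsync --daemon)
        [backup]
            path = /cygdrive/d/backup
            read only = false

        # on the source machine: -a preserves attributes, -H preserves hard links,
        # and the rsync:// transport keeps link detection off the SMB share entirely
        rsync -aH --delete /cygdrive/e/data/ rsync://destpc/backup/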

  • How to perform this Windows 7 permissions change on many files via GUI or command line

    - by hippietrail
    After using my external hard drive on another Windows 7 computer to tweak photos with Windows Live Photo Gallery and then upload them to Facebook, I found the modified images were no longer visible on the original Windows 7 computer. I'm not sure if the things I subsequently tried to get it working changed anything, but I do know this is the sequence of actions that makes the permissions of the modified files match those of the unmodified files:
    1. Right-click on the broken image file and select "Properties".
    2. On the "Security" tab, press the "Advanced" button.
    3. On the "Permissions" tab, press the "Continue" button with the shield icon on it.
    4. Tick the box marked "Include inheritable permissions from this object's parent".
    5. Click the "Remove" button to remove the only current entry "Type: Allow, Name: Administrators (XYZ\Administrators), Permission: Full control, Inherited From:".
    6. OK on the "Permissions" tab; OK on the "Security" tab.
    Now this same procedure does not work at the folder level; it results in "access denied" dialogs. I'm looking for some way to perform this exact modification on all the images I edited on the other computer. I'm happy to use the Windows GUI in Explorer or any other included tools, and I'm happy to use the Windows command line. I'd prefer not to use a third-party tool, since I'd have to be satisfied it's not doing anything else. I'm not looking for a different way to change permissions to other settings to make an external drive full of photos editable on multiple computers. At least not in this question.
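
    (The closest command-line equivalent to those GUI steps that I've found among the built-in tools is icacls' reset, which replaces explicit entries with inherited ones; a sketch, with E:\Photos standing in for my folder:)

        rem replace explicit ACEs with inherited ACEs, recursively, continuing past errors
        icacls "E:\Photos" /reset /T /C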

  • Apache RewriteEngine, redirect sub-directory to another script

    - by Niklas R
    I've been trying to achieve this for about 1.5 hours now. I want to have the following transformations when requesting sites on my website:

        homepage.com/                   => index.php
        homepage.com/archive            => index.php?archive
        homepage.com/archive/site-01    => index.php?archive/site-01
        homepage.com/files/css/main.css => requestfile.php?css/main.css

    The first three transformations can be done by using the following:

        RewriteEngine on
        RewriteRule ^/?$ index.php
        RewriteRule ^/?(.*)$ index.php?$1

    However, I'm stuck at the point where all requests to the files subdirectory should be redirected to requestfile.php. This is one of the tries I've made:

        RewriteEngine on
        RewriteRule ^/?$ index.php
        RewriteRule ^/?files/(.+)$ requestfile.php?$1
        RewriteRule ^/?(.*)$ index.php?$1

    But that does not work. I've also tried to put [L] after the third line, but that didn't help, as I'm using this configuration in .htaccess and sub-requests will transform that URL again, etc. I fiddled with the RewriteCond directive but couldn't get it to work. How does the configuration need to look to achieve what I desire?
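
    (My next attempt will be to guard the catch-all with a RewriteCond so the rewrite targets are left alone on the sub-request; a sketch:)

        RewriteEngine on
        RewriteRule ^/?$ index.php [L]
        RewriteRule ^/?files/(.+)$ requestfile.php?$1 [L]
        # skip URLs that already point at a PHP script, so the .htaccess
        # sub-request doesn't feed index.php back into the catch-all
        RewriteCond %{REQUEST_URI} !\.php$
        RewriteRule ^/?(.*)$ index.php?$1 [L]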
