Search Results

Search found 21662 results on 867 pages for 'may'.


  • Essbase Excel Add-in - S.o.D.

    - by THE
    Sadly, another long-lasting friend is about to be buried in the wet, cold data void that holds past programs (... and AOL CDs). The Essbase Excel Add-in is about to be discontinued (see Doc ID 1466700.1) in January '13. The (already released) version 11.1.2.2.x of the Excel Add-in must be considered the last release of this particular program (unless the guys from Applied OLAP bring out their own version next to the OpenOffice Add-in that they already sport). As expected, SmartView achieved functional parity with Release 11.1.2.1.102, and ever since then it was just a question of time before our old buddy would get the shoe. For all users out there like me who have known and worked with the Excel Add-in for the last decade(s), this is a loss. SmartView may have functional parity, and may altogether be the stronger, open technology - capable of Planning forms, connecting to HFM, etc. But (from my personal point of view) it will not give the end user the same direct access to his databases, with nothing between him and his Essbase Server. Of course it was to be expected that only one of the two could survive, and it was obvious that this would be SmartView, so this does not come as a surprise. Still. A minute for an old friend . . . . . . Thank you, and let us look forward! Unless you had other plans for the upcoming season, why not spend it investigating SmartView for your Essbase interaction needs. We hear that the days between Christmas and New Year hold unlimited potential for testing out new things. Or take it as a New Year's resolution: "I will switch to SmartView at the earliest possible moment".

    Read the article

  • Backing up data stored on Amazon S3

    - by Fiver
    I have an EC2 instance running a web server that stores users' uploaded files to S3. The files are written once and never change, but are retrieved occasionally by the users. We will likely accumulate somewhere around 200-500GB of data per year. We would like to ensure this data is safe, particularly from accidental deletions, and would like to be able to restore files that were deleted regardless of the reason. I have read about the versioning feature for S3 buckets, but I cannot seem to find whether recovery is possible for files with no modification history. See the AWS docs here on versioning: http://docs.aws.amazon.com/AmazonS3/latest/dev/ObjectVersioning.html In those examples, they don't show the scenario where data is uploaded, but never modified, and then deleted. Are files deleted in this scenario recoverable? Then, we thought we might just back up the S3 files to Glacier using object lifecycle management: http://docs.aws.amazon.com/AmazonS3/latest/dev/object-lifecycle-mgmt.html But it seems this will not work for us, as the file object is not copied to Glacier but moved to Glacier (more accurately it seems it is an object attribute that is changed, but anyway...). So it seems there is no direct way to back up S3 data, and transferring the data from S3 to local servers may be time-consuming and may incur significant transfer costs over time. Finally, we thought we would create a new bucket every month to serve as a monthly full backup, and copy the original bucket's data to the new one on Day 1. Then using something like duplicity (http://duplicity.nongnu.org/) we would synchronize the backup bucket every night. At the end of the month we would put the backup bucket's contents in Glacier storage, and create a new backup bucket using a new, current copy of the original bucket...and repeat this process. This seems like it would work and minimize the storage / transfer costs, but I'm not sure if duplicity allows bucket-to-bucket transfers directly without bringing data down to the controlling client first. So, I guess there are a couple of questions here. First, does S3 versioning allow recovery of files that were never modified? Is there some way to "copy" files from S3 to Glacier that I have missed? Can duplicity or any other tool transfer files between S3 buckets directly to avoid transfer costs? Finally, am I way off the mark in my approach to backing up S3 data? Thanks in advance for any insight you could provide!
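
    For what it's worth, a hedged sketch with the AWS CLI (the bucket and key names are hypothetical). On the first question: versioning also protects objects that were uploaded once and never modified; deleting one only writes a delete marker on top, and the original version remains recoverable:

      aws s3api put-bucket-versioning --bucket my-uploads \
          --versioning-configuration Status=Enabled
      # list versions and delete markers for a deleted key:
      aws s3api list-object-versions --bucket my-uploads --prefix user/file.bin
      # "undelete" by removing the delete marker reported above:
      aws s3api delete-object --bucket my-uploads --key user/file.bin \
          --version-id <delete-marker-version-id>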

    Read the article

  • Unable to print login-required images in IE

    - by Tim Fountain
    I have some images in a section of a site that require the user to be logged in in order to view them. These images are served by a PHP script, which checks the user's login state and, if valid, serves the binary data with the appropriate headers. This all works fine. The issue comes when a user tries to print one of these images. In Internet Explorer, when they go to print preview they get the broken-image box with a red cross in the corner instead of the actual file. This is what gets printed also. All other browsers can print the images without issue. I have some images elsewhere on the site that are also served via PHP but don't require a login. These print fine. The PHP-powered HTML pages on the site that require a login also print fine in IE. It's just login-required images. The user hitting print preview does not seem to result in an additional HTTP request to the server for the file. However, I do see an additional HTTP request a few seconds later that comes from the same IP (may or may not be related). This request includes no host header, no REQUEST_URI and no user agent. The 'please login' page sends an appropriate 403 header. I've also added a far-in-future expires header to the image response itself to ensure that browsers can serve/print the files from their own cache, but this hasn't made any difference. Why can't IE print the images, and what else can I do to investigate or fix the problem?
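
    One way to investigate (an assumption about the cause, with a hypothetical URL): IE's print preview renders from its local cache, so a Cache-Control: no-cache / no-store or Pragma: no-cache header on the protected response is a common culprit for exactly this broken-image symptom. Comparing the headers of a protected and a public image should show any difference:

      curl -sI -b "PHPSESSID=<valid-session-id>" https://example.com/image.php?id=42
      curl -sI https://example.com/public/image.php?id=7
      # compare the Cache-Control, Pragma, Vary and Content-Type lines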

    Read the article

  • How to build the mainline kernel source package?

    - by Maxime R.
    The Ubuntu kernel PPA only provides linux-headers*.deb and linux-image*.deb packages. How can I build the corresponding linux-source*.deb package? Context: I'm currently running Ubuntu 11.10 with the mainline kernel (3.2-rc6 at the moment) to get better support for my Sandy Bridge IGP (Dell E6420 laptop with an Intel i5-2520M CPU). It appears I'd like to install this touchpad driver, ALPS touchpads being badly supported (see the bug report in the previous link), while waiting for upstream support in kernel version 3.3. Problem is, DKMS keeps complaining about not finding the full kernel source: Module build for the currently running kernel was skipped since the kernel source for this kernel does not seem to be installed. It appears I may not need the full source, but I'd still like to try having it installed to see if it solves my problem. What I tried: Uncompressing the kernel.org source archive in /usr/src/. DKMS still complained. Manually updating the kernel source package with uupdate and the mainline source package as explained here. Did not succeed. Manually building the linux-source package following @roadmr and @elmicha's instructions. I eventually succeeded in building it, but DKMS still complained about the missing source. At last I noticed an error I had not caught in the first place while reinstalling the kernel headers. It appears the .deb I got may have been corrupted; downloading it again did the trick :) Alas, while DKMS agreed to compile the module, I ran into the following error, which appears to have already been reported. This issue isn't yet solved, but I won't pursue it because of the following: in the end I decided to test the 3.2-rc6 kernel through the xorg-edgers PPA, which appears to be correctly patched: it works. Nevertheless, it might still be of some interest to know how to build the mainline linux-source package, as the Ubuntu Kernel Team doesn't provide it. Not to mention that I learned a lot in the process ^^
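
    For reference, a sketch of building installable kernel packages straight from the kernel.org tree with the in-tree deb target (this produces image and headers packages, not Ubuntu's linux-source deb, but the headers tree at /lib/modules/$(uname -r)/build is what DKMS actually builds against):

      cd linux-3.2-rc6
      make deb-pkg LOCALVERSION=-custom      # builds linux-image and linux-headers .debs
      sudo dpkg -i ../linux-image-*custom*.deb ../linux-headers-*custom*.deb
      # after rebooting into the new kernel:
      sudo dkms autoinstall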

    Read the article

  • Are these company terms good for a programmer, or should I move?

    - by o_O
    Here are some of the terms and conditions set forward by my employer. Do these make sense for a job like programming? No freelancing in any way, even in your free time outside company work hours (maybe okay. Maybe they want their employees to be fully concentrating on their full-time job. Also, they don't want their employees to do similar work for a competing client. Completely rational in that sense). - So, sort of agreed. Anything you develop (ideas, design, code, etc.) while I'm employed there makes them the owner of it. Seriously? Don't you think that's bad (for me)? If I develop something in my free time (by cutting down on sleep and working hard), outside company time and resources, is that claim rational? I heard that Steve Wozniak had such a contract while he was working at HP. But that was hardware design, and also those companies pay well, compared to the peanuts I get. No other kind of work allowed. Means no open-source stuff. Fully dedicated to being a puppet for the employer, though the working environment is sort of okay. By my assessment this place would score a 10/12 on the Joel Test. So are these terms okay, especially considering the fact that I'm underpaid with peanuts?

    Read the article

  • Sucking Less Every Year?

    - by AdityaGameProgrammer
    "Sucking Less Every Year" - Jeff Atwood. I came across this insightful article. Quoting directly from the post: I've often thought that sucking less every year is how humble programmers improve. You should be unhappy with code you wrote a year ago. If you aren't, that means either A) you haven't learned anything in a year, B) your code can't be improved, or C) you never revisit old code. All of these are the kiss of death for software developers. How often does this happen or not happen to you? How long before you see an actual improvement in your coding? A month? A year? Do you ever revisit your old code? How often does your old code plague you? Or how often do you have to deal with your technical debt? It is definitely very painful to fix old bugs and dirty code that we may have written quickly to meet a deadline, and with those quick fixes, in some cases we may have to rewrite most of the application/code. No arguments about that. Some of the developers I have come across argued that they were already at the evolved stage where their coding doesn't need improvement or can't be improved anymore. Does this happen? If so, how many years into coding in a particular language does one expect this to happen? Related: Ever look back at some of your old code and grimace in pain? Star Wars Moment in Code "Luke! I am your code!" "No! Impossible! It can't be!"

    Read the article

  • There is No Scrum without Agile

    - by John K. Hines
    It's been interesting for me to dive a little deeper into Scrum after realizing how fragile its adoption can be.  I've been particularly impressed with James Shore's essay "Kaizen and Kaikaku" and the Net Objectives post "There are Better Alternatives to Scrum" by Alan Shalloway.  The bottom line: You can't execute Scrum well without being Agile. Personally, I'm the rare developer who has an interest in project management.  I think the methodology to deliver software is interesting, and that there are many roles whose job exists to make software development easier.  As a project lead I've seen Scrum deliver for disciplined, highly motivated teams with solid engineering practices.  It definitely made my job an order of magnitude easier.  As a developer I've experienced huge rewards from having a well-defined pipeline of tasks that were consistently delivered with high quality in short iterations.  In both of these cases Scrum was an addition to a fundamentally solid process and a huge benefit to the team. The question I'm now facing is how Scrum fits into organizations without solid engineering practices.  The trend that concerns me is one of Scrum being mandated as the single development process across teams where it may not apply.  And we have to realize that Scrum itself isn't even a development process.  This is what worries me the most - the assumption that Scrum on its own increases developer efficiency, when it is essentially an exercise in project management. Jim's essay quotes Tobias Mayer writing, "Scrum is a framework for surfacing organizational dysfunction."  I'm unsure whether a Vice President of Software Development wants to hear that, reality notwithstanding.  Our Scrum adoption has surfaced a great deal of dysfunction, but I feel the original assumption was that we would experience increased efficiency.  It's starting to feel like a blended approach - Agile/XP techniques for developers, Scrum for project managers - may be a better fit.  Or at least, a better way of framing the conversation. The blended approach. Technorati tags: Agile Scrum

    Read the article

  • Basic OpenVPN setup not working

    - by WalterJ89
    I am attempting to connect 2 Win7 (x64 + x86) computers (there will be 4 in total) using OpenVPN. Right now they are on the same network, but the intention is to be able to access the client remotely regardless of its location. The problem I am having is that I am unable to ping or tracert between the two computers. They seem to be on different subnets even though I have the mask set to 255.255.255.0. The server ends up as 10.8.0.1 255.255.255.252 and the client 10.8.0.6 255.255.255.252. And a third ends up as 10.8.0.10. I don't know if this is a Windows 7 problem or something I have wrong in my config. It's a very simple setup; I'm not connecting two LANs. This is the server config (removed all the extra lines because it was too ugly): port 1194 proto udp dev tun ca keys/ca.crt cert keys/server.crt key keys/server.key # This file should be kept secret dh keys/dh1024.pem server 10.8.0.0 255.255.255.0 ifconfig-pool-persist ipp.txt client-to-client duplicate-cn keepalive 10 120 comp-lzo persist-key persist-tun status openvpn-status.log verb 6 This is the client config: client dev tun proto udp remote thisdomainis.random.com 1194 resolv-retry infinite nobind persist-key persist-tun ca keys/ca.crt cert keys/client.crt key keys/client.key ns-cert-type server comp-lzo verb 6 Is there anything I missed in this? The keys are all correct and the VPNs connect fine; it's just the subnet or route issue. Thank you. EDIT: It seems on the server the openvpn-status.log has the routes for the client: SERVER OpenVPN CLIENT LIST Updated,Wed May 19 18:26:32 2010 Common Name,Real Address,Bytes Received,Bytes Sent,Connected Since client,192.168.10.102:50517,19157,20208,Wed May 19 17:38:25 2010 ROUTING TABLE Virtual Address,Common Name,Real Address,Last Ref 10.8.0.6,client,192.168.10.102:50517,Wed May 19 17:38:56 2010 GLOBAL STATS Max bcast/mcast queue length,0 END Also, this is from the client.log file, which seems to be correct: C:\WINDOWS\system32\route.exe ADD 10.8.0.0 MASK 255.255.255.0 10.8.0.5 Another EDIT: 'route print' on the server shows the route: Destination Mask Gateway Interface 10.8.0.0 255.255.255.0 10.8.0.2 10.8.0.1 The same on the client shows 10.8.0.0 255.255.255.0 10.8.0.5 10.8.0.6 So the routes are there... what can the problem be? Is there anything wrong with my configs? Why would OpenVPN be having problems communicating?
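
    A possible explanation and fix (an assumption, not from the original post): OpenVPN's default "net30" topology hands every Windows client its own isolated /30, which is exactly why the addresses come out as 10.8.0.1/255.255.255.252, 10.8.0.6 and 10.8.0.10. On OpenVPN 2.1 or later, declaring a flat subnet topology on the server puts all peers in one /24:

      # server.conf - use one flat /24 instead of a /30 per client
      topology subnet
      server 10.8.0.0 255.255.255.0

    Clients pick the topology up as a pushed option. If pings still fail after that, check that the Windows 7 firewall profile on each machine allows ICMP echo on the TAP adapter, since client-to-client is already enabled in the config above.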

    Read the article

  • Customer retention - why most companies have it wrong

    - by Michel Adar
    At least in the US market, it is quite common for service companies to offer an initially discounted price to new customers. While this may attract new customers and rob customers from competitors, it is my argument that it is a bad strategy for the company. This strategy gives an incentive to change companies and a disincentive to stay with the company. From the point of view of the customer, after 6 months of being a customer the company rewards the loyalty by raising the price. A better strategy would be to reward customers for staying with the company. For example, by lowering the cost by 5% every year (a compound discount, so it never gets to zero: a $100/month service would cost $95 after one year, about $90.25 after two, and about $59.87 after ten). This is a very rational thing to do for the company. Acquiring new customers and setting up their service is expensive; new customers also tend to use more of the common resources like customer service channels. It is probably true for most companies that the cost of providing service to a customer of 10 years is lower than providing the same service in the first year of a customer's tenure. It is only logical to pass these savings on to the customer. From the customer's point of view, the competition would have to offer something very attractive, whether in terms of price or service, in order for the customer to switch. Such a policy would give an advantage to the first mover, but would probably force the competitors to follow suit. Overall, I would expect that this would reduce mobility in the market, increase loyalty, increase the investment of companies in loyal customers and, ultimately, increase competition for providing a better service. Competitors may even try to break the scheme by offering to port customers' tenure, but that would not work that well, because it would disenchant existing customers and would be costly, assuming that the installation and first year are the costliest part of serving a customer. What do you think? Is this better than using "save offers" to retain flip-floppers?

    Read the article

  • Installing LibreOffice in Ubuntu 12.04 is impossible

    - by user1587239
    What is wrong with Ubuntu repositories? sudo apt-get install libreoffice Reading package lists... Done Building dependency tree Reading state information... Done Some packages could not be installed. This may mean that you have requested an impossible situation or if you are using the unstable distribution that some required packages have not yet been created or been moved out of Incoming. The following information may help to resolve the situation: The following packages have unmet dependencies: libreoffice : Depends: libreoffice-core (= 1:3.5.4-0ubuntu1.1) but it is not going to be installed Depends: libreoffice-writer but it is not going to be installed Depends: libreoffice-calc but it is not going to be installed Depends: libreoffice-impress but it is not going to be installed Depends: libreoffice-draw but it is not going to be installed Depends: libreoffice-math but it is not going to be installed Depends: libreoffice-base but it is not going to be installed Depends: libreoffice-filter-mobiledev but it is not going to be installed Depends: libreoffice-java-common (>= 1:3.5.4~) but it is not going to be installed Recommends: libreoffice-gnome but it is not going to be installed or libreoffice-kde but it is not going to be installed E: Unable to correct problems, you have held broken packages.
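
    A hedged troubleshooting sketch (the PPA name below is hypothetical): first find out where the conflicting candidate versions come from, then either install the blocking core package directly to surface the real conflict, or roll back a third-party repository with ppa-purge:

      sudo apt-get update
      apt-cache policy libreoffice libreoffice-core   # which repo supplies 1:3.5.4-0ubuntu1.1?
      sudo apt-get install libreoffice-core           # usually names the truly broken dependency
      sudo apt-get install ppa-purge && sudo ppa-purge ppa:some/ppa   # if a PPA caused the mismatch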

    Read the article

  • Topeka Dot Net User Group (DNUG) Meeting – April 6, 2010

    - by Robz / Fervent Coder
    Topeka DNUG is free for anyone to attend! Mark your calendars now! SPEAKER: Troy Tuttle is a self-described pragmatic agilist, and Kanban practitioner, with more than a decade of experience in delivering software in the finance and health industries and as a consultant. He advocates teams improve their performance through pursuit of better practices like continuous integration and automated testing. Troy is the founder of the Kansas City Limited WIP Society and is a speaker at local area groups on team related topics. He currently works as a Project Lead Consultant with AdventureTech Group of Kansas City, KS. TOPIC: Why Kanban? Kanban is receiving a large amount of attention recently. What does it offer compared to other approaches? Answering that question may require you to hit the “reset” button on previously held biases and assumptions. Kanban blends Lean thought with ideas from first generation agile methodologies. To get started with Kanban, we will examine what steps are necessary to establish a transparent, work-limited, pull system. We will highlight the perils of allowing too much work-in-progress and how it affects development performance. Once established, Kanban teams need only a few metrics and tools to monitor their performance and improvement. WHERE: Federal Home Loan Bank Topeka on the Security Benefit Campus – Directions? WHEN: 11:30 AM - 1:00 PM on April 6th, 2010 REGISTER: http://topekadotnet.wufoo.com/forms/topeka-dnug-meeting-attendance/ ADDITIONAL INFO: As always, please sign in and out of FHLBank to help them with their accountability. Please park in the visitors section at the front of the building when you arrive. If  there are no spots in visitors you may park in the overflow lot at the far east end of the facility.  Lunch will be provided and we will have some great door prizes!

    Read the article

  • Multi-Resolution Mobile Development

    - by user2186302
    I'm about to start development on my first game for mobile phones (I already have a Flash prototype completed, so it's just a matter of "porting" it to mobile and fixing up the code) and plan on hopefully getting the game working on iPhones and most Android devices. I am using Haxe along with OpenFL and HaxeFlixel for development. My question is: what resolution should I design the game in initially, and/or what is the best way to develop a game for multiple resolutions? I have found multiple different methods; the best, in my opinion, is strategy 3 on this page: http://wiki.starling-framework.org/manual/multi-resolution_development. However, I have some questions about this. First, what would be the best base resolution to use? The guide suggests 240x320, which seems alright to me, although if I choose to use pixel graphics, as I most probably will given I'm using HaxeFlixel, I'm not sure if they'll look too blocky on larger screens - which I'm not even sure is a problem, as it might still look alright. (Honestly, I'm not sure about that, and I'd appreciate examples of games that use this method and look nice.) Finally, please feel free to share whatever methods you use and think are best. For example, HaxeFlixel has a scaling feature that scales the game to fit the exact screen size, but I'm afraid that would lead to blurry and improperly scaled graphics, since it would scale by non-integers. I'm not sure how noticeable a problem that may or may not be, although from experience I'm pretty sure it won't look nice, and currently I don't think I'm going to go for this option. So, I would really appreciate any help on this subject. Thank you in advance.

    Read the article

  • any online service and/or application to develop a story line for an adventure game?

    - by Gajet
    A bunch of friends and I were talking about an adventure game. There will be many possibilities in the game, and the player can pick from a wide variety of choices at each stage. There will be consequences for each decision, and they may or may not end the story. The result would be something like the picture from the FlashForward series (S01E17), or, if any of you watched Heroes season 1, the similar timelines represented as strings in Isaac Mendez's workshop. Sorry for the bad-quality examples, but right now I can't think of any better ones. Do you know any website or application we can use to create the timeline? These features are the least required ones: the ability to represent events as boxes; the ability to connect distant events to each other; the ability to move events on a scene freely; the ability to expand the scene easily; some color options for the lines representing connections between events; easily sharing the idea with one another. It's much better to have a WYSIWYG editor and to be able to explore easily in a large scene of events. In the end, if you know any application that could let me create a board just like the one in my sample picture and share it with other friends, it could help us a lot.

    Read the article

  • Strange and erratic transformations when using OpenGL VBOs to render scene

    - by janoside
    I have an existing iOS game with fairly simple scenes (all textured quads) and I'm using Apple's "Texture2D" class. I'm trying to convert this class to use VBOs, since the vertices of my objects basically never change, so I may as well not re-create them for every object every frame. I have the scene rendering using VBOs, but the sizes and orientations of all rendered objects are strange and erratic - though locations seem generally correct. I've been toying with this code for a few days now, and I've found something odd: if I re-create all of my VBOs each frame, everything looks correct, even though I'm almost certain my vertices are not changing. Other notes: I'm basing my work on this tutorial, and therefore am also using "IBOs"; I create my buffers before rendering begins; my buffers include vertex and texture data; and I'm using OpenGL ES 1.1. Fearing some strange effect of the current matrix GL state at the time of buffer creation, I've also tried wrapping my buffer-setup code in a "pushMatrix-loadIdentity-popMatrix" block, which (as expected) had no effect. I'm aware that various articles have been published demonstrating that VBOs may not help performance, but I want to understand this problem and at least have the option to use them. I realize this is a shot in the dark, but has anyone else experienced this type of strange behavior? What might I be doing to result in this behavior? It's rather difficult for me to isolate the problem since I'm working in an existing, moderately complex project, so suggestions about how to approach the problem are also quite welcome.

    Read the article

  • For a JavaScript library, what is the best or standard way to support extensibility?

    - by Michael Best
    Specifically, I want to support "plugins" that modify the behavior of parts of the library. I couldn't find much information on the web about this subject. But here are my ideas for how a library could be extensible. The library exports an object with both public and "protected" functions. A plugin can replace any of those functions, thus modifying the library's behavior. Advantages of this method are that it's simple and that the plugin's functions can have full access to the library's "protected" functions. Disadvantages are that the library may be harder to maintain with a larger set of exposed functions and it could be hard to debug if multiple plugins are involved (how to know which plugin modified which function?). The library provides an "add plugin" function that accepts an object with a specific interface. Internally, the library will use the plugin instead of its own code if appropriate. With this method, the internals of the library can be rearranged more freely as long as it still supports the same plugin interface. This could also support having different plugin interfaces to modify different parts of the library. A disadvantage of this method is that the plugins may have to re-implement code that is already part of the library since the library's internal functions are not exported. The library provides a "set implementation" function that accepts an object inherited from a specific base object. The library's public API calls functions in the implementation object for any functionality that can be modified and the base implementation object includes the core functionality, with both external (to the API) and internal functions. A plugin creates a new implementation object, which inherits from the base object and replaces any functions it wants to modify. This combines advantages and disadvantages of both the other methods.

    Read the article

  • What tools can be used to download all images in a webpage?

    - by bobo
    I would like to download all images in a web page. The tool should be smart enough to examine the CSS and JavaScript files in the page source to look for the images. Ideally, it should also replicate the folder hierarchy, saving the images in the correct folders. For example, the web page may have some images for menu items stored in images/menu/, while background images may be stored in images/bg/. Is there such a tool that you know of? (Preferably for Windows, but Linux is still OK.) Many thanks to you all.
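
    A hedged sketch with wget (the URL is hypothetical): wget 1.12 and later parse CSS url(...) references, so --page-requisites fetches images referenced from stylesheets as well as from the HTML, and it recreates the site's folder hierarchy on disk:

      wget --page-requisites --convert-links --no-parent \
           --directory-prefix=saved-site http://example.com/page.html

    Images whose URLs are generated by JavaScript at runtime will not be discovered this way; those still need to be collected by hand or with a browser extension.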

    Read the article

  • IIS 6 on Windows 2003: help installing an SSL cert

    - by ADAM
    I requested a new SSL cert from GoDaddy, which has been issued. When I try to install it in IIS through the website's Directory Security tab, I get a "the pending certificate request for this response file was not found. this request may be cancelled. you cannot install selected response certificate using this wizard" error. I may have run the wizard and deleted the pending request. Is there any way I can install the certificate without getting a new one? (I hope so.) I have the original certrequest.txt file.
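
    A possible recovery path (speculative, not confirmed anywhere in this post): if the key pair from the original request still exists in the machine certificate store, certreq can bind the issued certificate to it outside the IIS wizard:

      certreq -accept mycert.cer

    If deleting the pending request also removed the key pair, the issued certificate cannot be installed at all; in that case, generate a new CSR in IIS and use GoDaddy's re-key option on the existing certificate.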

    Read the article

  • Virtualization or Raw Metal?

    - by THE
    With the growing number of customers who want to run Oracle's EPM/BI (or other Fusion Middleware software) in a virtualized environment, we face more and more people asking whether running Oracle software within VMware is supported or not. Two KM articles reflect Oracle's policy towards the use of VMware: 249212.1 and 475484.1. The bottom line is: "you may use it at your own risk, but Oracle does not recommend it". So far we have seen few problems with the use of VMware (other than performance and the usual limitations), but Oracle does not certify its software for use in VMware (and specifically for RAC software actively refuses any support), and any issue that may occur will be fixed for the native OS only. It is on the customer to prove that the issue is NOT due to VMware in case an issue is encountered. See: "Oracle Fusion Middleware Supported System Configurations page" and also "Supported Virtualization and Partitioning Technologies for Oracle Fusion Middleware".

    Read the article

  • Do you know how to move the Team Foundation Server cache

    - by Martin Hinshelwood
    There are a number of reasons why you may want to change the folder where you store the TFS cache. It can take up "some" amount of room, so moving it to another drive can be beneficial. This is the source control cache that TFS uses to cache data from the database. Moving the cache is pretty easy and should allow you to organise your server space a little more efficiently. You may also get a (small) performance improvement by putting it on another drive. 1. Create a new directory to store the cache, e.g. "d:\TfsCache\" (Figure: Create a new folder). 2. Give the local TFS WPG group full control of the directory (Figure: You need to use the App Tier Service WPG). 3. In the application tier web.config (~\Application Tier\Web Services\web.config), add the following setting to the appSettings section (Figure: The web.config for TFS is stored in the application folder; adding this will trigger a restart of the app pool): <appsettings> ... <add value="D:\" key="dataDirectory" /> ... </appsettings> (Figure: Your web.config should look something like this.) The app pool will automatically recycle and Team Web Access will start using the new location. If you then download a file (not via a proxy), a folder with a GUID should be created immediately in the folder from step 1. If the folder doesn't appear, then you probably don't have permissions set up properly.
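
    Step 2 as a command (a sketch; "TFS_APPTIER_SERVICE_WPG" is the default WPG group name on a TFS 2010 application tier and may differ on your server):

      icacls d:\TfsCache /grant TFS_APPTIER_SERVICE_WPG:(OI)(CI)F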

    Read the article

  • Dependency issue when installing some packages with apt

    - by Julien Genestoux
    We are having a little trouble installing some php5-* packages. I am really not sure what is going on. We have the latest version of php5-common. # apt-get install php5-mcrypt Reading package lists... Done Building dependency tree Reading state information... Done Some packages could not be installed. This may mean that you have requested an impossible situation or if you are using the unstable distribution that some required packages have not yet been created or been moved out of Incoming. The following information may help to resolve the situation: The following packages have unmet dependencies: php5-mcrypt: Depends: php5-common (= 5.2.6.dfsg.1-1+lenny16) but 5.2.11-0.dotdeb.0 is to be installed E: Broken packages We get similar errors with many other php5-* packages, like php5-imap.
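
    What the error says (plus a hedged sketch; the exact dotdeb version of php5-mcrypt is an assumption): php5-common was upgraded to 5.2.11 from dotdeb, while php5-mcrypt is still coming from Debian lenny and insists on lenny's 5.2.6. Either take the whole php5 stack from dotdeb, or pin dotdeb lower so php5-common drops back to lenny's build:

      apt-cache policy php5-common php5-mcrypt        # shows which repo supplies each candidate
      apt-get install php5-mcrypt=5.2.11-0.dotdeb.0   # if dotdeb ships a matching build
      # or pin dotdeb below the lenny archive in /etc/apt/preferences and downgrade php5-common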

    Read the article

  • Announcing Spacewalk Support for Oracle Linux Basic and Premier Customers

    - by Michele Casey
    Over the years, customers migrating to Oracle Linux have asked for options to provide a transitional solution for their existing system management tools (such as Red Hat Satellite Server) while evaluating and planning migrations to Oracle's Enterprise Manager, which is offered at no additional charge with Oracle Linux Support Subscriptions.  Based on this request, we are pleased to announce support for the open-source community project, Spacewalk, which is the basis for both Red Hat Satellite Server and SUSE Manager.  Effective today, customers with Oracle Linux Basic and Premier Support subscriptions have access to a fully supported Spacewalk build which can be setup to easily manage Oracle Linux systems.   Spacewalk support for Oracle Linux requires Oracle Linux 6, x86_64 for the server and provides support for Oracle Linux 5 and Oracle Linux 6 (x86, x86_64) clients.  This solution requires Oracle Database 11g Release 2 as the  supported database repository for Spacewalk with Oracle Linux.  Within the next several weeks, a limited use license for the Oracle Database will be included with this offer.  Until this is complete, customers may use an existing Oracle database license or they may begin by downloading a 30-day trial license from eDelivery.  Customers with Oracle Linux Basic and Premier subscriptions will automatically have access to the channel hosting the supported build.  Please review the release notes for further instructions. Oracle Enterprise Manager is still the recommended enterprise solution for managing Oracle Linux systems and we want to provide the easiest transition path for our customers.  We are excited to offer this solution to our Oracle Linux customers while they plan and implement their migration to Oracle Enterprise Manager. 

    Read the article

  • Apple's Java Mac OS X 2012-006 Update

    - by Sharon Zakhour
    The recent Java Mac OS X 2012-006 update from Apple removes the Apple Java 6 plug-in from your Mac. The Mac OS X Install FAQ will be updated with the next 7u release, but you may find the following information useful in the meantime. Q: I have installed Java for OS X 2012-006 and Apple Java 6 can no longer be used for applets or Web Start. How do I get it back? A: The Java for OS X 2012-006 update from Apple uninstalls the Apple-provided Java applet plug-in from all web browsers. You can download the latest version of Java from Oracle, which has improved security, reliability and compatibility. If you prefer to continue using Apple's Java 6 plug-in, you can follow the steps provided in How to re-enable the Apple-provided Java SE 6 applet plug-in and Web Start functionality. Q: After installing Java for OS X 2012-006, can I continue to use Apple's Java 6 alongside the OS X JDK or JRE for Java 7? A: If you want to continue to develop with Java 6 in a Terminal window, you can modify the startup script for your favorite command environment. For bash, use this: export JAVA_HOME=`/usr/libexec/java_home -v 1.6` Some applications use /usr/bin/java to invoke Java. After installing Java for OS X 2012-006, /usr/bin/java will find the newest JDK installed and will use that for all of the Java-related command line tools in /usr/bin. You may need to modify those applications to find Java 6, or contact the developer for a newer version of the application. Also, this update removes the Apple-provided Java Preferences app. For more information on how to determine the default version of Java on your system, see Determining the Installed Version of the JRE in the JRE 7 Installation for Mac OS X page.
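
    A quick way to check what is installed after the update (a sketch, not from the original FAQ):

      /usr/libexec/java_home -V        # lists every installed JVM
      /usr/libexec/java_home -v 1.6    # prints the Java 6 home, if one is still present
      java -version                    # the default the command-line tools now resolve to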

    Read the article

  • Lubuntu 12.04 on Acer laptop boots to blank blue screen

    - by WGCman
    My previous question on this was closed, but I am posting it again, as the solution my son eventually found may assist other users of the forum, or someone may be able to tweak the solution to improve the performance. Having installed Kubuntu 12.04.01 from a live USB onto my desktop, I wanted to do the same on my laptop, an Acer Aspire 1362, which has 256MB of RAM (actually 512MB "on the box", but a good deal of it can be borrowed by the graphics chip!). I found Kubuntu wouldn't run on so little memory, but downloaded Lubuntu-12.04-alternate-i386.iso, which I understood was light enough to go. The laptop has one internal 40GB Toshiba hard drive divided into 3 partitions: C:, 19GB, with Windows XP, Windows program files and some data; D:, 19GB, mostly data; and a small 2GB partition with some Acer software, which XP can't normally "see". I transferred most of the contents of D: to a memory stick, leaving 16GB free for Lubuntu. I did not want to dump XP yet, though it is painfully slow. I installed Lubuntu from the USB stick, accepting the default answers to most of the questions. The D: partition was further partitioned into a 500MB boot partition, 10GB for Linux, 2GB swap and 6GB for data shareable between Linux and Windows. I had no error messages during installation, rebooted, was offered the choice of Ubuntu or XP, and selected the former. After a few minutes, I get a dark blue screen announcing Lubuntu, with five dots underneath which lighten in turn. Eventually the lights stop, and whatever I try, the screen remains blank apart from "Lubuntu". I tried several solutions suggested on the forum for "identical" questions, but without success.

    Read the article
