Search Results

Search found 37152 results on 1487 pages for 'shared access'.

Page 264/1487 | < Previous Page | 260 261 262 263 264 265 266 267 268 269 270 271  | Next Page >

  • How do I access and enable more icons to be in the system tray?

    - by Jon
    So I'm messing around with Natty a little, and I noticed that all the apps that would normally use the system tray (or "notification area"?) aren't displaying there. Is that a bug, or is that the way it's going to be? I heard something about Ubuntu getting rid of that feature entirely. Is there a way to add it back? I mean, I didn't really like it, either, especially when there were apps that used it unnecessarily, but I can't use CryptKeeper at all now, or easycrypt, and I don't know whether Dropbox has synced without opening Nautilus.
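
    A possible direction (a hedged sketch, not from the thread itself): in Natty the Unity panel keeps a whitelist of applications allowed to use the notification area, so re-enabling it for everything is usually a one-line change.

        # Assumes the stock Unity panel in 11.04; log out and back in (or restart Unity)
        # for the change to take effect.
        gsettings set com.canonical.Unity.Panel systray-whitelist "['all']"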

    Read the article

  • How do I know which file a program is trying to access?

    - by user9069
    I have a program which I am trying to run; however, when I run it, it just complains that it can't find a particular file, and I have no idea which folder it is looking in. I have a copy of the required file, I just need to know which folder to copy it to. Is there any way to show in real time which files are being accessed, or which files the program is trying and failing to access? I am using the Ext4 filesystem if that helps. Thanks
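
    One common way to answer this (a sketch; the program name is a placeholder): trace the program's file-related system calls and keep only the ones that fail because the file does not exist.

        # -f follows child processes; ENOENT marks the paths that were tried and are missing
        strace -f -e trace=open,openat,stat ./yourprogram 2>&1 | grep ENOENT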

    Read the article

  • Why there suddenly were so many 400 request in my access log?

    - by LotusH
    Below is a small part of my access_log:

        118.186.8.50 - - [19/Dec/2011:22:42:57 +0800] "-" 400 0 "-" "-"
        118.186.8.50 - - [19/Dec/2011:22:42:57 +0800] "-" 400 0 "-" "-"
        118.186.8.50 - - [19/Dec/2011:22:42:57 +0800] "-" 400 0 "-" "-"
        118.186.8.50 - - [19/Dec/2011:22:42:57 +0800] "-" 400 0 "-" "-"
        118.186.8.50 - - [19/Dec/2011:22:42:57 +0800] "-" 400 0 "-" "-"
        220.173.136.39 - - [19/Dec/2011:22:43:22 +0800] "-" 400 0 "-" "-"
        220.173.136.39 - - [19/Dec/2011:22:43:22 +0800] "-" 400 0 "-" "-"
        220.173.136.39 - - [19/Dec/2011:22:43:22 +0800] "-" 400 0 "-" "-"
        220.173.136.39 - - [19/Dec/2011:22:43:22 +0800] "-" 400 0 "-" "-"
        220.173.136.39 - - [19/Dec/2011:22:43:22 +0800] "-" 400 0 "-" "-"
        220.173.136.39 - - [19/Dec/2011:22:43:22 +0800] "-" 400 0 "-" "-"

    And the volume was huge - something like one hundred thousand of these 400 requests per second. And I'm pretty sure there were no errors on my site in that period of time (no error reports, and I didn't change the source code).
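
    For context (not from the thread): Apache logs a request as "-" with status 400 when a client opens a connection but never sends a valid request line, so bursts like this usually come from scanners, broken clients or a plain connection flood rather than from the site's own code. One hedged mitigation sketch, assuming iptables is available:

        # Limit each source IP to 20 new connections to port 80 per 10 seconds
        iptables -A INPUT -p tcp --dport 80 -m state --state NEW -m recent --set
        iptables -A INPUT -p tcp --dport 80 -m state --state NEW \
                 -m recent --update --seconds 10 --hitcount 20 -j DROP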

    Read the article

  • How can I access bitmaps created in another activity?

    - by user22241
    I am currently loading my game bitmaps when the user presses 'start' in my animated splash screen activity (the first / launch activity), and the app then progresses from this activity to the main game activity. This causes choppy animation in the splash screen while it loads/creates the bitmaps for the new activity. I've been told that I should load all my bitmaps in one go at the very beginning. However, I can't work out how to do this - could anyone please point me in the right direction? I have 2 activities, a splash screen and the main game. Each consists of a class that extends Activity and a class that extends SurfaceView (with an inner class for the rendering / logic updating). So, for example, at the moment I am creating my bitmaps in the constructor of my SurfaceView class like so:

        public class OptionsScreen extends SurfaceView implements SurfaceHolder.Callback {
            // Create variables here

            public OptionsScreen(Context context) {
                // Create bitmaps here
            }

            public void intialise() {
                // This method is called from onCreate() of corresponding application context
                // Create scaled bitmaps here (from bitmaps previously created)
            }
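
    One way to do this (a rough sketch; the class, key and resource names are made up, not from the post): decode the bitmaps once into a shared cache before the splash animation starts, and have both SurfaceViews read from it instead of decoding in their constructors.

        // Shared holder, filled once (e.g. from the splash Activity's onCreate)
        public final class BitmapStore {
            private static final java.util.Map<String, android.graphics.Bitmap> CACHE =
                    new java.util.HashMap<String, android.graphics.Bitmap>();

            public static void loadAll(android.content.res.Resources res) {
                // R.drawable.player is a placeholder resource id
                CACHE.put("player",
                        android.graphics.BitmapFactory.decodeResource(res, R.drawable.player));
                // ... decode the rest of the game bitmaps here ...
            }

            public static android.graphics.Bitmap get(String key) {
                return CACHE.get(key);
            }
        }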

    Read the article

  • How do I securely share / allow access to a drive?

    - by sleske
    To simplify backing up a laptop (Windows Vista), I'm planning on sharing its C: drive (with password protection) and using that to back it up from another computer. What are the security implications of this? If I share C: with a reasonable password, how big is the risk of compromise if the system is e.g. inadvertently used on a public WLAN or similar? Background: I'm planning to use Areca Backup to back up two systems (Windows XP and Vista). My current plan is to install Areca on the XP box, and share the Vista system's C: as a shared folder, so the XP system can read it. Then I can set up the drive as a network drive and have Areca read it like a local drive. Of course, if you can think of a more elegant way of doing this, I'm open to suggestions.

    Read the article

  • Sharing Windows Folders on a Network... other PCs see but can't access

    - by John
    I'm so tired of network setup issues. All I want to do is share a folder and all its sub-folders so other PCs on my network can view and change this remote location. Why is it that setting a dir to "shared" doesn't actually make it usable in any way? The other PC can see the folder but is unable to actually open it and look inside. It seems every time I want to do this I go through some semi-random process of right-clicking the folder and enabling sharing, then looking in the folder properties to add permissions and other sharing... and then I end up with some folders working but others will randomly block permission on certain files or sub-dirs. I have 5 PCs in my local testing network and I cannot believe it should be this complicated... where is the simple "make this folder work on the network" option?! I have a mixture of XP, Vista & W7 machines, but this seems common to all of them.
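
    The usual culprit (general background, not from the thread): Windows checks two separate permission sets - the share permissions and the NTFS permissions on the folder - and access is only granted when both allow it. A hedged sketch from an elevated command prompt on one of the Vista/W7 machines (share name and path are examples):

        :: Create the share and open up its share-level permissions
        net share MyShare=C:\Data /grant:Everyone,CHANGE
        :: Make sure the NTFS permissions also allow the remote users
        icacls C:\Data /grant Everyone:(OI)(CI)M /T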

    Read the article

  • How do I access my remote Ubuntu server via X-windows from my Mac?

    - by Magooda
    I have an Ubuntu server (12.04 LTS) running remotely on a cloud hosting service. I have installed ubuntu-desktop via apt-get: $ sudo apt-get install ubuntu-desktop It appears to have installed with no problem. I have confirmed that /etc/ssh/ssh_config on the server contains the lines ForwardAgent yes, ForwardX11 yes and ForwardX11Trusted yes, and that /etc/ssh/sshd_config on the server contains the line X11Forwarding yes. I then rebooted the server. It came back up with no problem. Now, starting X11.app on my Mac I am presented with an Xterm. I connect to my server from this terminal using: $ ssh -X <myhost> and I connect to the server, no problem. At this point I don't know what to do. I have tried $ sudo startx but I get a "no screens found" error. I don't have screens because it's a headless cloud server, but I just want to access it from my Mac through X. What now?
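
    The usual answer (a sketch, not from the thread): with "ssh -X" there is no server-side X session to start - startx tries to start one on the server's non-existent display. Instead, run individual X clients over the forwarded connection and they appear on the Mac:

        ssh -X myhost            # or -Y for trusted forwarding
        echo $DISPLAY            # should print something like localhost:10.0
        xterm &                  # any X client works: gedit, nautilus, etc.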

    Read the article

  • How can I access form elements when using an ASP.NET MVC Ajax form?

    - by Patrick
    I've got an Ajax form in an MVC 2 application. I cannot find the proper way to access the form elements within the Ajax form declaration. I can access the names of the elements with Request.Form.Keys but I can't access the actual values. I've read numerous examples of posting forms with jQuery, but my form has elements created dynamically based on route values (sometimes it could be 2 text boxes, sometimes 10, given unique names like so: <%= Html.TextBox("Evaluation"+Model.EvaluationId.ToString()) %>), so I couldn't find a way to make that work with jQuery. Is there another way that the form elements can be accessed?
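
    One approach that is commonly used (a hedged sketch; the controller, action name and field prefix are examples): take a FormCollection parameter in the controller action - it exposes both keys and values of everything that was posted, including dynamically named fields.

        using System.Web.Mvc;

        public class EvaluationController : Controller
        {
            [HttpPost]
            public ActionResult Save(FormCollection form)
            {
                foreach (string key in form.AllKeys)
                {
                    if (key.StartsWith("Evaluation"))
                    {
                        string value = form[key];   // the posted value of that text box
                        // ... persist value ...
                    }
                }
                return new EmptyResult();
            }
        }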

    Read the article

  • My Folders Become Hidden System Files And Access Denied?

    - by echolab
    I just asked a question on the Superuser site. I have an external HDD which suddenly seems infected, and one of my folders, which contains my photos, changed to something like this. One of the Superuser guys suggested that I install Ubuntu and try to scan and change permissions, but I am not an Ubuntu expert - and believe me, if my problem is solved I will switch to it (because all I do is writing and taking photos, and I am tired of malware and the like). Now I have Ubuntu and Fedora installed (after a long read through guides), and both of them show the infected folder as empty (in Windows 7 I see these strange folders, as you can see in the picture).
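
    A commonly suggested first step (a sketch, not from the thread; the drive letter is an example): malware of this kind often just sets the hidden and system attributes on the real folders, which can be cleared from a Windows command prompt once the drive has been scanned.

        :: run from an elevated command prompt, assuming the external drive is E:
        attrib -h -s -r E:\*.* /s /d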

    Read the article

  • Consolidation in a Database Cloud

    - by B R Clouse
    Consolidation of multiple databases onto a shared infrastructure is the next step after Standardization. The potential consolidation density is a function of the extent to which the infrastructure is shared. The three models provide increasing degrees of sharing:

      - Server: each database is deployed in a dedicated VM. Hardware is shared, but most of the software infrastructure is not. Standardization is often applied incompletely since operating environments can be moved as-is onto the shared platform. The potential for VM sprawl is an additional downside.
      - Database: multiple database instances are deployed on a shared software / hardware infrastructure. This model is very efficient and easily implemented with the features in the Oracle Database and supporting products. Many customers have moved to this model and achieved significant, measurable benefits.
      - Schema: multiple schemas are deployed within a single database instance. The most efficient model, it places constraints on the environment. Usually this model will be implemented only by customers deploying their own applications. (Note that a single deployment can combine Database and Schema consolidations.)

    Customer value: lower costs, better system utilization. In this phase of the maturity model, under-utilized hardware can be used to host more workloads, or retired and those workloads migrated to consolidation platforms. Customers benefit from higher utilization of the hardware resources, resulting in reduced data center floor space, and lower power and cooling costs. And, the OpEx savings from Standardization are multiplied, since there are fewer physical components (both hardware and software) to manage.

    Customer value: higher productivity. The OpEx benefits from Standardization are compounded since not only are there fewer types of things to manage, now there are fewer entities to manage. In this phase, customers discover that their IT staff has time to move away from "day-to-day" tasks and start investing in higher value activities. Database users benefit from consolidating onto shared infrastructures by relieving themselves of the requirement to maintain their own dedicated servers. Also, if the shared infrastructure offers capabilities such as High Availability / Disaster Recovery, which are often beyond the budget and skillset of a standalone database environment, then moving to the consolidation platform can provide access to those capabilities, resulting in less downtime.

    Capabilities / Characteristics: In this phase, customers will typically deploy fixed-size clusters and consolidate on a cluster until that cluster is deemed "full," at which point a new cluster is built. Customers will define one or a few cluster architectures that are used wherever possible; occasionally there may be deployments which must be handled as exceptions. The "full" policy may be based on number of databases deployed on the cluster, or observed peak workload, etc. IT will own the provisioning of new databases on a cluster, making the decision of when and where to place new workloads. Resources may be managed dynamically, e.g., as a priority workload increases, it may be given more CPU and memory to handle the spike. Users will be charged at a fixed, relatively coarse level; or in some cases, no charging will be applied.

    Activities / Tasks: Oracle offers several tools to plan a successful consolidation. Real Application Testing (RAT) has a feature to help plan and validate database consolidations. Enterprise Manager 12c's Cloud Management Pack for Database includes a planning module. Looking ahead, customers should start planning for the Services phase by defining the Service Catalog that will be made available for database services.

    Read the article

  • Samba issue with sharing directories on NTFS/FAT32 (Mounted Drives) ???

    - by Microkernel
    Hi guys, I have some strange problems with the Samba server. I am using Samba version 3.5.4 on Ubuntu 10.10. I have two Windows XP machines, one on VirtualBox on Ubuntu and another office laptop. The Windows machine on VBox has no issues accessing the shared folders, but the laptop is not able to access all the shared content. The issue faced on the laptop is: shared folders on Ext3 drives have no access issues, but the contents shared on NTFS and FAT32 drives (mounted ones) are not accessible. When I try to open the shared folder, it asks for a user name and password, but doesn't accept them when I provide them (even if I provide admin login details!!!). I changed the workgroup value to the domain name on the office laptop, but the problem persists... Here is the smb.conf I am using:

        [global]
           workgroup = XXX.XXX.ORG
           server string = %h server (Samba, Ubuntu)
           map to guest = Bad User
           obey pam restrictions = Yes
           pam password change = Yes
           passwd program = /usr/bin/passwd %u
           passwd chat = *Enter\snew\s*\spassword:* %n\n *Retype\snew\s*\spassword:* %n\n *password\supdated\ssuccessfully* .
           unix password sync = Yes
           syslog = 0
           log file = /var/log/samba/log.%m
           max log size = 1000
           dns proxy = No
           usershare allow guests = Yes
           panic action = /usr/share/samba/panic-action %d
           guest ok = Yes

        [homes]
           comment = Home Directories

        [printers]
           comment = All Printers
           path = /var/spool/samba
           read only = No
           create mask = 0700
           printable = Yes
           browseable = No

        [print$]
           comment = Samba server's CD-ROM
           path = /cdrom
           force user = nobody
           force group = nobody
           locking = No

    The workgroup was defined as "HOMENET" before; I changed it to the domain name on the office laptop thinking that was the problem, but to no avail. Thanks in advance. Regards, Microkernel
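
    One thing that often bites here (general background, not from the thread): NTFS and FAT32 mounts have no Unix ownership or permission bits of their own, so Samba's permission checks against them behave differently from Ext3 shares. A hedged sketch is to give the mount an explicit owner and mode in /etc/fstab so the Samba user can actually read it (device, mount point and uid/gid are examples):

        /dev/sdb1   /media/ntfsdrive   ntfs-3g   defaults,uid=1000,gid=1000,umask=000   0   0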

    Read the article

  • How to (legitimately) access files after putting self into chrooted sandbox?

    - by unknown google user
    Changing a Linux C++ program which gives the user limited file access. Thus the program chroots itself to a sandbox with the files the user can get at. All worked well. Now, however, the program needs to access some files for its own needs (not the user's), but they are outside the sandbox. I know chroot allows access to files opened before the chroot, but in this case the needed files could be a few among many hundreds, so it is obviously impractical to open them all just for the couple that might be required. Is there any way to get at the files?
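
    One idea worth evaluating (a hedged sketch; the paths are made up): keep a file descriptor for a directory outside the jail open across the chroot, and resolve individual files relative to it with openat(), so nothing has to be opened in advance except that one directory.

        /* cc -o sketch sketch.c ; requires Linux and privileges to chroot */
        #define _GNU_SOURCE
        #include <fcntl.h>
        #include <unistd.h>

        int main(void) {
            /* grab the directory before entering the sandbox */
            int outside = open("/opt/myapp/private", O_RDONLY | O_DIRECTORY);
            if (outside < 0) return 1;

            if (chroot("/srv/sandbox") != 0 || chdir("/") != 0) return 1;

            /* later: reach any file under the saved directory without a pre-opened fd */
            int fd = openat(outside, "config.dat", O_RDONLY);
            if (fd >= 0) close(fd);
            close(outside);
            return 0;
        }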

    Read the article

  • How to access Session values from layers beneath the web application layer.

    - by Matthew Vines
    We have many instances in our application where we would like to be able to access things like the currently logged-in user ID in our business domain and data access layer. On login we push this information to the session, so all of our front-end code has access to it fairly easily, of course. However, we are having huge issues getting at the data in lower layers of our application. We just can't seem to find a way to store a value in the business domain that has global scope just for the user (static classes and properties are of course shared by the application domain, which means all users share just one copy of the object). We have considered passing the session in to our business classes, but then our domain is very tightly coupled to our web application. We want to keep the prospect of a WinForms version of the application possible going forward. I find it hard to believe we are the first people to have this sort of issue. How are you handling this problem in your applications?
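
    A common pattern for this (a hedged sketch; the interface and class names are illustrative): hide the session behind an abstraction so the lower layers depend on an interface, not on System.Web. The web project supplies a session-backed implementation; a WinForms host could supply its own.

        // Lives in the business/domain assembly
        public interface ICurrentUserContext
        {
            int UserId { get; }
        }

        // Lives in the web project only
        public class HttpSessionUserContext : ICurrentUserContext
        {
            public int UserId
            {
                get { return (int)System.Web.HttpContext.Current.Session["UserId"]; }
            }
        }

        // Business-layer code receives the abstraction via constructor injection
        public class OrderService
        {
            private readonly ICurrentUserContext _user;
            public OrderService(ICurrentUserContext user) { _user = user; }

            public void PlaceOrder()
            {
                var auditUserId = _user.UserId;  // no reference to the web layer
                // ...
            }
        }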

    Read the article

  • How to prevent a 404 Error when creating a subdomain and using www to access it?

    - by Chris
    I have installed a multi-site installation of WordPress onto my domain. I then added the necessary code to the wp-config.php file and .htaccess as instructed by WordPress. I also installed a plugin called Quick Page/Post Redirect Plugin which allowed me to place a 301 redirect onto the main domain, as I only want to use the sub domain and not the main domain. Then I also added the following line of code to the wp-config.php file to redirect the main domain:

        define( 'NOBLOGREDIRECT', 'URL Redirect Address' );

    The site works fine with a redirect on the main domain, and my subdomain runs fine when you type in subdomain.example.com or http://subdomain.example.com. However when I enter www.subdomain.example.com or http://www.subdomain.example.com the following error message is returned:

        Not Found
        The requested URL / was not found on this server.
        Apache/2.4.9 (Unix) Server at www.subdomain.domain.com Port 80

    Any help with this would be much appreciated.
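
    For orientation (a hedged sketch, not from the thread; host names and paths are examples): an Apache-branded 404 like this usually means the www form reaches the server but does not match the subdomain's virtual host, so the request lands in a default vhost with nothing at /. Adding the www name as a ServerAlias (and a DNS record for it, if one doesn't already exist) is the usual fix.

        # DNS zone (only if www.subdomain doesn't already resolve):
        #   www.subdomain.example.com.   IN   CNAME   subdomain.example.com.

        <VirtualHost *:80>
            ServerName  subdomain.example.com
            ServerAlias www.subdomain.example.com
            DocumentRoot /var/www/wordpress
        </VirtualHost>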

    Read the article

  • How can I access old KMail emails after upgrading to Ubuntu 11.10?

    - by Gary Kleppe
    I initially installed Ubuntu 11.04 and used KMail for my email. All well and good. Then I upgraded to Ubuntu 11.10. Presumably an upgrade of KMail took place as part of this. Now KMail won't even run; when I try, it tells me "Failed to fetch the resource collection" and crashes. I don't mind switching to another email client, but I'd very much like to be able to recover all of the emails I have stored in KMail. Any suggestions on how to do this? Thanks for any help anyone can provide.
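
    One low-risk first step (a hedged sketch - exact paths vary between KMail versions and setups): the messages themselves are plain maildir/mbox files on disk, so back up the likely locations before experimenting; another client can usually import them from there.

        # copy whatever exists; missing paths are silently skipped
        cp -a ~/.kde/share/apps/kmail    ~/kmail-backup      2>/dev/null
        cp -a ~/.local/share/local-mail  ~/localmail-backup  2>/dev/null
        cp -a ~/Mail                     ~/Mail-backup       2>/dev/null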

    Read the article

  • How to access / query Team Foundation Server 2012 with Odata?

    - by cseder
    I've tried to find a solution for this for hours now, and I keep getting the same results in the end: being asked to install a lot of Azure and other stuff, plus to run some example project .sln that I can't open with my 2012 version of Visual Studio. So I'm pretty much stuck, and have some pretty straightforward questions regarding this:

      1. Does TFS 2012 include the OData service in any way, so that I don't have to install it?
      2. If not, how can I install a NATIVE 2012 version of the OData service for TFS 2012?
      3. Is it possible that I'm aiming for the wrong target here?

    I'm looking for a solution to the following: I have a TFS 2012 server that I need to be able to create Work Items on programmatically, based on data from our Help Desk system. Then I need to query these Work Items for status changes since their creation, and update the Help Desk database. Am I better off using the "regular" TFS API? I was kind of thinking that the OData way was more "future proof", but I'm not sure...
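
    For the underlying goal, the "regular" TFS client object model is usually enough (a hedged sketch; the server URL, project, work item type and query values are examples):

        using System;
        using Microsoft.TeamFoundation.Client;
        using Microsoft.TeamFoundation.WorkItemTracking.Client;

        class HelpDeskSync
        {
            static void Main()
            {
                var collection = TfsTeamProjectCollectionFactory.GetTeamProjectCollection(
                    new Uri("http://tfs.example.local:8080/tfs/DefaultCollection"));
                var store = collection.GetService<WorkItemStore>();

                // create a work item from a help desk ticket
                var type = store.Projects["HelpDesk"].WorkItemTypes["Bug"];
                var wi = new WorkItem(type) { Title = "Imported from help desk ticket #1234" };
                wi.Save();

                // later: poll for state changes to push back into the help desk database
                var changed = store.Query(
                    "SELECT [System.Id], [System.State] FROM WorkItems " +
                    "WHERE [System.TeamProject] = 'HelpDesk' AND [System.ChangedDate] > '2012-01-01'");
            }
        }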

    Read the article

  • How Does Windows Confirm Wi-Fi Access and Whether Hot Spot Authentication Is Necessary?

    - by Jason Fitzpatrick
    Windows is quite adept at telling you if you have a properly functioning Internet connection, but how exactly does it do so? Digging into how Windows handles the problem offers insight into Windows connectivity messages. Today's Question & Answer session comes to us courtesy of SuperUser, a subdivision of Stack Exchange, a community-driven grouping of Q&A web sites.

    Read the article

  • How can I configure my windows service in the code to access the desktop?

    - by Pankaj
    I have created a Windows service. I want to open some Windows-based applications from this service, but my service is unable to start desktop applications. To enable the access I had to do the following steps:

      1. Opened the administrative tool "Services".
      2. Right-clicked on my service and selected "Properties".
      3. In the "Log On" tab, selected "Allow service to interact with desktop".

    After that my service can open the desired Windows-based processes. Can I configure my Windows service in the code (C#) to access the desktop, so that I won't have to change the access permission manually after installation?
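
    One way this is often done (a hedged sketch; the service name is a placeholder, and note that interactive services are blocked by session 0 isolation on Vista and later, so launching a process in the user's session is generally the recommended alternative): set the SERVICE_INTERACTIVE_PROCESS flag on the service's Type value from the installer, so the checkbox doesn't have to be ticked by hand.

        using System.Collections;
        using System.ComponentModel;
        using System.Configuration.Install;
        using Microsoft.Win32;

        [RunInstaller(true)]
        public class InteractiveServiceInstaller : Installer
        {
            protected override void OnAfterInstall(IDictionary savedState)
            {
                base.OnAfterInstall(savedState);
                // OR the SERVICE_INTERACTIVE_PROCESS flag (0x100) into the Type value
                using (var key = Registry.LocalMachine.OpenSubKey(
                    @"SYSTEM\CurrentControlSet\Services\MyServiceName", true))
                {
                    if (key != null)
                    {
                        int type = (int)key.GetValue("Type", 0);
                        key.SetValue("Type", type | 0x100);
                    }
                }
            }
        }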

    Read the article

  • Shared WCF client code between .NET and Silverlight apps?

    - by Eduardo Scoz
    I'm developing a .NET application that will have both a WinForms and a Silverlight client. Although the majority of code will be in the server, I'll need to have quite a bit of logic in the clients as well, and I would like to keep the client library code the same. From what I could figure out so far, I need to have two different project types, a class library and a Silverlight class library, and link the files from one project to the other. This seems kind of lame, but it works for simple code. My problem, though, is that the code generated by the SVCUtil.exe to access WCF services is different from the code generated by the slsvcutil.exe, and the silverlight code is actually incompatible with the .NET one: I get a bunch of problems with the System.ServiceModel.Channel classes when I try to import the class into .NET. Has anybody done anything similar to this before? What am I doing wrong?
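
    One pattern that tends to work here (a hedged sketch; the service and type names are invented): skip the generated proxies entirely, put the [ServiceContract] interface in the linked/shared code, and build the channel by hand on each platform. Silverlight only supports the asynchronous Begin/End contract shape, so the shared interface has to be written that way.

        using System;
        using System.ServiceModel;

        public class Order { }   // placeholder DTO

        [ServiceContract]
        public interface IOrderService
        {
            [OperationContract(AsyncPattern = true)]
            IAsyncResult BeginGetOrders(int customerId, AsyncCallback callback, object state);
            Order[] EndGetOrders(IAsyncResult result);
        }

        public static class OrderServiceClient
        {
            public static IOrderService Create()
            {
                var factory = new ChannelFactory<IOrderService>(
                    new BasicHttpBinding(),
                    new EndpointAddress("http://example.local/OrderService.svc"));
                return factory.CreateChannel();
            }
        }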

    Read the article

  • How and when to log account access login with PHP?

    - by Nazgulled
    I want to implement a basic login system in a PHP app where no cookies will be involved. I mean, the user closes the browser and the login expires; it will remain active during the browser session (or until the user explicitly logs out) otherwise. I want to log all this activity, and I'm thinking that every time the user refreshes the page, opens a different link or logs out, I record that time as the last access made by that user, overwriting the previous access log entry. But my problem is: when and how should I insert another record into the database instead of overwriting the last one? Should I just define a timeout, and if the last access was made longer ago than that timeout, insert another log entry into the database? Should the session expire too after that timeout? Or is there a better way? Ideally, I would like to log the "log out action" when the browser was closed, but I don't think there's a way to detect that, is there? Suggestions?
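
    A minimal sketch of the timeout idea (table and column names are examples, and the SQL is left as comments): on every request, either refresh the current log row or start a new one if the gap since the last access exceeds the timeout.

        <?php
        session_start();
        $timeout = 30 * 60;           // treat a 30-minute gap as a new "visit" in the log
        $now     = time();

        if (!isset($_SESSION['last_seen']) || $now - $_SESSION['last_seen'] > $timeout) {
            // INSERT INTO access_log (user_id, started_at, last_seen) VALUES (...)
            // and remember the new row id in $_SESSION['log_id']
        } else {
            // UPDATE access_log SET last_seen = ? WHERE id = $_SESSION['log_id']
        }
        $_SESSION['last_seen'] = $now;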

    Read the article

  • How can I make a case for "dependency management"?

    - by C. Ross
    I'm currently trying to make a case for adopting dependency management for builds (a la Maven, Ivy, NuGet) and creating an internal repository for shared modules, of which we have over a dozen enterprise-wide. What are the primary selling points of this build technique? The ones I have so far:

      - Eases the process of distributing and importing shared modules, especially version upgrades.
      - Requires the dependencies of shared modules to be precisely documented.
      - Removes shared modules from source control, speeding and simplifying checkouts/check-ins (when you have applications with 20+ libraries this is a real factor).
      - Allows more control or awareness of what third-party libs are used in your organization.

    Are there any selling points that I'm missing? Are there any studies or articles giving improvement metrics?

    Read the article

  • How do I install graphviz 2.29 in 12.04?

    - by bidur
    In my Ubuntu 12.04, graphviz is not the latest version (2.29), and I need some features available only in the latest version. I tried to install graphviz 2.29, which requires libgraphviz4 (=2.18). I installed libgraphviz4 anyway and then installed graphviz 2.29; for that I had to remove the packages libcdt4 and libpathplan4. Now whenever I try to generate a graph, I get problems. For example:

        dot -Kfdp -n -Tpng -o samplePOS.png forcePOS.dot

    says:

        dot: error while loading shared libraries: libgvc.so.6: cannot open shared object file: No such file or directory

    and

        neato -Tps -o sample_1.ps sourcedot.gv

    says:

        neato: error while loading shared libraries: libgvc.so.6: cannot open shared object file: No such file or directory

    So I am looking for a way to run graphviz 2.29 on my Ubuntu 12.04.
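
    A hedged sketch of the usual checks when a freshly installed library isn't found (the install prefix is an example): the dynamic loader simply doesn't know where the new libgvc.so.6 lives yet.

        sudo ldconfig                      # refresh the shared-library cache
        ldconfig -p | grep libgvc          # check whether libgvc.so.6 is now known
        # or, as a temporary workaround if it was installed under a non-standard prefix:
        export LD_LIBRARY_PATH=/usr/local/lib/graphviz:$LD_LIBRARY_PATH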

    Read the article

  • are runtime linking library globals shared among plugins loaded with dlopen?

    - by conejoroy
    I've a C++ program that links at runtime with, let's say, mylib.so. Then the same program uses dlopen()/dlsym() to load a function from myplugin.so, a dynamic library that in turn has dependencies on mylib.so. My question is: will the program AND the function in the plugin access the same globals defined in mylib.so, in the same memory area reserved for the program, or will each be assigned different, unrelated copies in its own memory space? If the latter is the default behaviour, is it possible to change that? Thanks in advance =)!
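
    For what it's worth, a sketch of the usual behaviour (the plugin path and symbol name are placeholders): because mylib.so is already mapped when the plugin is loaded, the dynamic linker reuses that single copy, so the program and the plugin normally share its globals. RTLD_GLOBAL only controls whether the plugin's own symbols become visible to libraries loaded later.

        /* link with: cc main.c -ldl */
        #include <dlfcn.h>
        #include <stdio.h>

        int main(void) {
            void *plugin = dlopen("./myplugin.so", RTLD_NOW | RTLD_GLOBAL);
            if (!plugin) {
                fprintf(stderr, "%s\n", dlerror());
                return 1;
            }
            /* both the caller and this function see the same globals in mylib.so */
            void (*fn)(void) = (void (*)(void))dlsym(plugin, "plugin_entry");
            if (fn) fn();
            dlclose(plugin);
            return 0;
        }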

    Read the article

  • Permissions issue: how can Apache access files in my Home directory?

    - by richzilla
    I know file permissions have been covered on here before, but I'm struggling to get my head around the concept for my scenario. I created the files on an old Ubuntu installation. I've copied the files into my new Ubuntu installation and put them in my webroot. When I attempt to run the files (they're PHP files) I get an error relating to permissions. In an attempt to fix this, I assumed that they must still be owned by the previous owner, so I ran chown -R on the directory, with my username as an argument, in order to take ownership of all of the files in the directory. It should be noted that the usernames between the new and old Ubuntu installations were the same. When I attempt to run the files again, same problem: a 500 error due to permissions problems. Can anyone tell me what other steps I should take? The webroot for my Apache installation is inside my home folder. If I create new files in my webroot, they also work as expected; it's only the old files that are causing the problem.
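
    A hedged sketch of the usual checks (the paths and username are examples): the Apache user (www-data on Ubuntu) needs execute permission on every directory on the way to the files and read permission on the files themselves; namei shows exactly where the chain breaks.

        namei -l /home/youruser/webroot/index.php   # walk the path, show each permission
        chmod o+x /home/youruser                    # let others traverse the home dir
        chmod -R o+rX /home/youruser/webroot        # readable files, traversable dirs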

    Read the article

  • can rsyslog transfer files present in a directory

    - by Tarun
    I have configured rsyslog and it's working fine, as intended. These are the conf files:

    Server side:

        $template OTHERS,"/rsyslog/test/log/%fromhost-ip%/others-log.log"
        $template APACHEACCESS,"/rsyslog/test/log/%fromhost-ip%/apache-access.log"
        $template APACHEERROR,"/rsyslog/test/log/%fromhost-ip%/apache-error.log"
        if $programname == 'apache-access' then ?APACHEACCESS
        & ~
        if $programname == 'apache-error' then ?APACHEERROR
        & ~
        *.* ?OTHERS

    Client side:

        # Apache default access file:
        $ModLoad imfile
        $InputFileName /var/log/apache2/access.log
        $InputFileTag apache-access:
        $InputFileStateFile stat-apache-access
        $InputFileSeverity info
        $InputRunFileMonitor

        # Apache default error file:
        $ModLoad imfile
        $InputFileName /var/log/apache2/error.log
        $InputFileTag apache-error:
        $InputFileStateFile stat-apache-error
        $InputFileSeverity error
        $InputRunFileMonitor

        if $programname == 'apache-access' then @10.134.125.179:514
        & ~
        if $programname == 'apache-error' then @10.134.125.179:514
        & ~
        *.* @10.134.125.179:514

    Now, in rsyslog, instead of defining separate files, can I give the complete directory, so that the client automatically sends all the log files present in /var/log/apache2 and the syslog server stores them under different filenames?
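
    As a pointer (hedged - this depends on the rsyslog version): the legacy imfile directives need one block per file, but newer rsyslog releases (the v8 imfile module) accept wildcards, so a whole directory can be watched with a single input, for example:

        module(load="imfile")
        input(type="imfile"
              File="/var/log/apache2/*.log"
              Tag="apache:"
              Severity="info")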

    Read the article

< Previous Page | 260 261 262 263 264 265 266 267 268 269 270 271  | Next Page >