Search Results

Search found 15248 results on 610 pages for 'slashdot configuration'.


  • HTML Manifest for Content Folios

    - by Kyle Hatlestad
    I recently worked on a project to create a custom content folio renderer in WebCenter Content. It needed to output the native files in the folio along with a manifest file in HTML format, which would list the contents of the folio along with any designated metadata and a relative link to each file within the download. This way a person could hand someone the folio download and it would be a self-contained package, with all of the content and a single file to display the information on the contents. The default Zip rendition of the folio outputs the web-viewable version of each file with an HDA-formatted file alongside it, and unless you are fluent in HDA or have a tool to read them, those are difficult to consume. I thought this might be useful for others, so I'm posting a copy of the component here. Beyond the standard instructions for installing a component, there is an environment configuration file (folionativezipwithmanifestrenderer_environment.cfg) which has a couple of options:

    - FolioMetadataManifestList - A comma-separated list of metadata fields (system or custom) that should be included in the manifest file.
    - FolioMetadataManifestUseOriginalFilename - (True or False) If set to True, the filenames in the zip file are based on the original filename as it was checked into WebCenter Content. If False, it uses the 'Name' of the item as defined within the Folio, which is usually the Title of the item.

    The component also includes the source code, so feel free to use this as a reference for creating other interesting folios.
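
    As a hedged illustration, the environment file might look like this (the setting names come from the post; the field values are invented examples, not defaults):

        # folionativezipwithmanifestrenderer_environment.cfg
        FolioMetadataManifestList=dDocTitle,dDocAuthor,dSecurityGroup
        FolioMetadataManifestUseOriginalFilename=true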


  • Wireless performance on Ubuntu 9.10

    - by Brian
    Is there something I should do to my networking configuration in Ubuntu to improve the performance of my wireless connection? I'm on a netbook dual-booting Windows 7 and Ubuntu 9.10. I pick up a much stronger wifi signal in Windows than in Ubuntu. As soon as I boot Ubuntu, it connects to the network with a strong signal, but then loses the signal very quickly. After it drops, I can't reconnect. I've tested this on a couple of different networks with the same outcome.
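
    A first diagnostic sketch (the interface name wlan0 is an assumption, and power saving is only one possible culprit alongside driver issues):

        # identify the wireless driver and check link quality
        lspci -k | grep -i -A 3 network
        iwconfig wlan0
        # wifi power management is a common cause of signal drops; try turning it off
        sudo iwconfig wlan0 power off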


  • Why does Postfix deliver mails locally instead of relaying them to Google Apps?

    - by user40388
    I get the following error trying to send an email to my Google Apps address at [email protected] from my Postfix server:

        to=, relay=local, delay=0.09, delays=0.07/0/0/0.02, dsn=5.1.1, status=bounced (unknown user: "admin")

    Is there a way I can force it not to use the local relay and to treat [email protected] as an outside address, instead of looking for a user in the current Postfix configuration? I am trying to email the full address "[email protected]", not just "admin". I have the Google Apps MX record on mydomain.com plus an SPF record, which before was:

        v=spf1 include:_spf.google.com ~all

    (emailing [email protected] used to work with that record), but I had to change it to:

        v=spf1 a mx ip4:MY.IP.HERE include:_spf.google.com ~all
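
    relay=local usually means Postfix believes mydomain.com is one of its own local domains, so it looks for a Unix user instead of consulting MX records. A sketch of the usual check and fix (paths assume a stock install):

        # list the domains Postfix delivers locally
        postconf mydestination
        # if mydomain.com appears, remove it from mydestination in /etc/postfix/main.cf,
        # leaving something like:
        #   mydestination = localhost.localdomain, localhost
        sudo postfix reload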


  • Disable IPv6 on Debian VPS

    - by chris_l
    I have a Debian Lenny VPS that's running virtualized by Parallels/Virtuozzo. Currently, the network interface doesn't have an IPv6 address - and that's good, because I don't have an ip6tables configuration. But I assume that I could wake up one day and ifconfig would show me an IPv6 address for the interface - because I have no control over the kernel or its modules; they're under the control of the hosting company. That would leave the server completely vulnerable to attacks from IPv6 addresses. What would be the best way to disable IPv6 (for the interface, or maybe for the entire host)? Usually I would simply disable the kernel module, but that's not possible in this case.
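
    If the host's kernel is recent enough to expose it, the disable_ipv6 sysctl can be set from inside the container; a sketch (whether Virtuozzo permits it depends on the host kernel, so treat this as something to verify):

        # disable IPv6 on all interfaces, if the kernel allows it
        sudo sysctl -w net.ipv6.conf.all.disable_ipv6=1
        sudo sysctl -w net.ipv6.conf.default.disable_ipv6=1
        # persist the setting across reboots
        echo "net.ipv6.conf.all.disable_ipv6 = 1" | sudo tee -a /etc/sysctl.conf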


  • Google Chrome with strange behavior

    - by user72274
    I'm a former Chromium browser user, but after the PPA went two months without updates, I switched to Google Chrome yesterday. Everything is okay, except some strange behavior on some pages and crashing after loading "chrome://" configuration pages. The best-known website with the strange behavior is YouTube; there is a picture of what I see: When I open the user menu in the top right corner, it breaks in that way, and even after closing the menu, some parts of the menu stay displayed. You may say it's a YouTube problem - no, I have this problem on at least three other websites; here it is on Imgur: The problem doesn't always affect the whole side of the page; sometimes it starts from the middle of the screen. The interesting part is that it happens every time at the same distance from the right border. When I check the DOM elements with the developer tools, the overlay which shows an element's position is rendered how it should be. What is more, if there is an anchor in the corrupted area, it still works after clicking on it. Selecting text in a corrupted page is impossible. I hope there is enough information to give me some advice, thanks in advance. :)

    EDIT: Here is what the browser reports in "chrome://gpu-internals/":

        Graphics Feature Status
        Canvas: Software only, hardware acceleration unavailable
        Compositing: Hardware accelerated
        3D CSS: Hardware accelerated
        CSS Animation: Software animated.
        WebGL: Hardware accelerated
        WebGL multisampling: Hardware accelerated

        Problems Detected
        Accelerated CSS animation has been disabled at the command line.
        Accelerated 2d canvas is unstable in Linux at the moment.

    Ubuntu 12.04 | Gnome-shell 3.4.1 | ATI Radeon 4550 | Screen resolution 1024*768 | Chrome version 20.0.1132.57 (Official Build 145807)
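
    Given the rendering corruption and the GPU feature status above, one quick experiment (a diagnostic assumption, not a confirmed fix) is to see whether the artifacts disappear with hardware acceleration off:

        google-chrome --disable-gpu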


  • Error message "the disk drive for /home is not ready..." when connecting an external hard drive

    - by seallussus
    I am running Ubuntu 10.04 with all updates installed to date (3/28/2012), and when I connect another SATA HD I get this message:

        The disk drive for /home is not ready yet or not present. Continue to wait, or press S to skip mounting or M for manual recovery

    So I press S, but I get this message:

        Could not update ICEauthority file /home/username/.ICEauthority

    And when I press close I get this:

        There is a problem with the configuration server (/usr/libconf2-4/gconf-sanity-check-2 exited with status 256)

    And when I press close I get this message:

        Nautilus could not create the following required folders: /home/username/desktop, /home/username/.Nautilus. Before running Nautilus, please create these folders or set permissions such that Nautilus can reach them.

    Finally, when I press OK at the last message, it disappears and I get a blank screen with a lot of colors on it and nothing else. So I shut down (power button), disconnect the HD, and boot without problems. How can I fix this in simple commands? I am a total Linux noob.

    Notes:
    - In the original errors, my username was in place of "username" (I did not want to confuse anybody).
    - I tried searching for this problem, but I got a lot of different answers, and most of them were really complicated to me and did not work.
    - I have a data HD connected and working without problems; the HD I installed Ubuntu on is also SATA (maybe that helps).
    - Apologies for my bad English, it's not my mother language.
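
    A common cause of this exact symptom is /etc/fstab mounting /home by a device name such as /dev/sdb1, which can shift to a different letter when an extra SATA drive is attached. A sketch of the usual fix (the UUID shown is a placeholder; use the one blkid prints for your /home partition):

        # find the UUID of the /home partition
        sudo blkid
        # then, in /etc/fstab, mount /home by UUID instead of device name, e.g.:
        #   UUID=0a1b2c3d-...   /home   ext4   defaults   0   2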


  • DD-WRT with both https and sshd running on port 443

    - by Bruno
    I have a Buffalo router with dd-wrt v24 SP2. After setting up the basic stuff, I enabled https access to the admin page. Several days later, while setting up remote ssh, I changed the default port from 22 to 443. And now... well :) You get the picture :) I can ssh to the router, but I have no web access to its admin page. Before rushing to a cold-hearted configuration reset, is there any way to change the ssh port from a shell? Or make dd-wrt accessible through basic http?
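
    Since ssh access still works, the port can likely be moved back over that same session; a sketch, assuming the stock nvram key for the ssh daemon's port (check nvram show | grep ssh for the exact name on your build):

        nvram set sshd_port=22
        nvram commit
        reboot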


  • Debian Squeeze can't install php-pear

    - by Lennier
    I use Debian 6.0.6.

        sudo apt-get install php-pear

    results in:

        Some packages could not be installed. This may mean that you have
        requested an impossible situation or if you are using the unstable
        distribution that some required packages have not yet been created
        or been moved out of Incoming.
        The following information may help to resolve the situation:

        The following packages have unmet dependencies:
         initscripts : Breaks: console-setup (< 1.74) but 1.68+squeeze2 is to be installed
                       Breaks: initramfs-tools (< 0.104) but 0.98.8 is to be installed
                       Breaks: nfs-common (< 1:1.2.5-3) but 1:1.2.2-4squeeze2 is to be installed
         keyboard-configuration : Breaks: console-setup (< 1.71) but 1.68+squeeze2 is to be installed
         klibc-utils : Breaks: initramfs-tools (< 0.103) but 0.98.8 is to be installed
        E: Broken packages

    How can I solve it?
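
    Those Breaks lines suggest apt is trying to pull in packages newer than Squeeze's (the console-setup and initramfs-tools versions being demanded are from a later release). A diagnostic sketch to find where they come from:

        apt-cache policy initscripts console-setup initramfs-tools
        # look for wheezy/testing/sid lines in the sources
        grep -rE "wheezy|testing|sid|unstable" /etc/apt/sources.list /etc/apt/sources.list.d/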


  • Is A Managed Switch With VLAN Support Required

    - by Justin
    Hello, I am wondering whether I need to buy a managed switch (with VLAN support) for my configuration, or whether a cheaper unmanaged switch will work. I have servers with two NICs each. The first NIC is public and the second NIC is private. The router will plug into switch port 1, let's say (public). Then server 1 public plugs into port 2 on the switch, and server 1 private plugs into port 3 on the switch. The public interface is 192.168.X.X / 255.255.0.0 and the private interface is 10.0.X.X / 255.255.0.0. So it looks like:

        ** SWITCH **
        Port  Device            Network
        1     Router/Firewall   192.168.X.X
        2     Server 1 Public   192.168.X.X
        3     Server 1 Private  10.0.X.X
        4     Server 2 Public   192.168.X.X
        5     Server 2 Private  10.0.X.X
        6     Server 3 Public   192.168.X.X
        7     Server 3 Private  10.0.X.X

    Thanks.


  • Possible to redirect from HTTPS to HTTP behind load-balancer?

    - by Derek Hunziker
    I have a basic ASP.NET application that sits behind an F5 load-balancer. Incoming SSL requests (over HTTPS) terminate at the load-balancer, and all internal communication between the load-balancer and my application servers is unsecured (over HTTP). When an unsecure request comes in, my app is able to use Response.Redirect("https://...") to redirect to a secure URL with no problems. However, the other direction appears to be impossible: I cannot redirect from HTTPS to HTTP using Response.Redirect() from my application. The URL remains HTTPS for the client and does not change. Could the F5 be preventing the redirect from ever reaching the client? Is there any special configuration necessary to let this happen?
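
    If the F5 turns out to be intercepting or rewriting the Location header, an alternative is to issue the downgrade redirect on the load-balancer itself; a sketch of an iRule (the /public prefix is an invented example of a path that should be plain HTTP):

        when HTTP_REQUEST {
            # send non-sensitive areas of the site back to plain HTTP
            if { [HTTP::path] starts_with "/public" } {
                HTTP::redirect "http://[HTTP::host][HTTP::uri]"
            }
        }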


  • How to remove IE toolbar and menu bar

    - by Metallikanz
    We have an ASP.NET web application which will be used in an intranet environment on IE 6. We want to change the default configuration of the browser so that it's always rendered without the toolbars, menu bar and address bar; just the browser window frame and the status bar should be present. We were looking at the IEAK toolkit for IE 6, but it doesn't seem to have the option of turning all of this off, though you can turn off certain menus and toolbar options. Any ideas how this can be done? Is there a group policy setting or something that we can utilize here to get this done? Thanks for your help.
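
    One built-in option worth testing (an alternative to IEAK, with the caveat that it removes even more chrome than asked for, including the status bar) is IE's kiosk mode:

        iexplore -k http://intranet.example.com/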


  • What methods are there to configure puppet to serve resources for multiple environments?

    - by cclark
    I seem to come across two ways of using puppet in multiple environments:

    1) Install a puppetmaster in each environment, and only update the recipes from source control for that environment when ready to deploy the recipes in that environment.

    2) Use one puppetmaster, use a variable in each client's puppet.conf to specify its environment, and then on the puppetmaster specify a different modulepath for each environment, with each of those paths updated to the branch of the recipe repository intended for that environment (e.g. dev, staging, production).

    Running only one puppetmaster seems like one less piece of infrastructure to keep running, but there is some additional complexity in the configuration. Are there additional pros or cons to one of these methods, or something I'm missing entirely?
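
    A sketch of what option 2 classically looks like in puppet.conf (paths and environment names are illustrative):

        # on the puppetmaster
        [production]
            modulepath = /etc/puppet/environments/production/modules
        [staging]
            modulepath = /etc/puppet/environments/staging/modules
        [dev]
            modulepath = /etc/puppet/environments/dev/modules

        # on each client
        [agent]
            environment = staging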


  • Windows Internet Connection sharing with Mobile Broadband

    - by PaoloFCantoni
    Due to circumstances, I have only got mobile broadband where I am living. I have a small network with an ADSL router (which isn't connected to the Internet). I want to use ICS to allow one machine (with the MBB modem) to act as the Internet interface, and to allow the other machines connected to the ADSL router (including a new Android tablet, via WiFi) to use the single mobile broadband connection. I have a feeling that my configuration is not valid as it stands, but I'm not sure. Can some kind soul lead me "by the nose" to getting this working? FWIW, the machines are all running Windows 7. TIA, Paolo


  • Kubuntu 11.10: a lot of networking problems

    - by Cobraone
    Since I upgraded to 11.10 I have had a lot of problems with KDE. First of all, there are problems configuring a static IP address. To explain: at home I have a normal fiber ADSL line and use DHCP. When I go to a customer, I must set a static IP address. With ifconfig everything seems OK, but something is wrong with DNS name lookups. (I installed Ubuntu and things worked again.) Now I have reinstalled Kubuntu 11.10 and I have the same problem. In addition, today I discovered that if I connect to a network in another customer's office, the desktop freezes and I can only switch between windows with Alt+Tab; no Fn key works, and right-clicking to open the run command does not work either. So I unplugged the network (the configuration there is just DHCP) and tried another position in the office: it was the same. My laptop freezes when connected; a friend's Fedora 14 machine works. So I decided to connect my Galaxy S II as a USB network device. Everything is OK for about 3 minutes; then, when I noticed a little loss of signal, the desktop froze again and I had to work (like now) just switching between windows with Alt+Tab. Additional information: unplugging the network or restarting it via Konsole does not solve the freezing problem; every time I must open a console and reboot. Any idea what tests to do? Just a note: if I should post logs or anything else here, please guide me. I have used Linux since Ubuntu 9, but I am not an "expert".
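
    For the static IP plus DNS symptom, one test worth trying is configuring the interface outside NetworkManager; a sketch with placeholder addresses (the dns-nameservers line needs the resolvconf package, otherwise set the servers in /etc/resolv.conf directly):

        # /etc/network/interfaces
        auto eth0
        iface eth0 inet static
            address 192.168.1.50
            netmask 255.255.255.0
            gateway 192.168.1.1
            dns-nameservers 192.168.1.1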


  • Several border firewalls in the same network

    - by nimai
    I'm currently analyzing the consequences of multipath connections for firewalls. In that context, I'm wondering whether it's really uncommon to have several firewalls at the borders of a network to protect it. The typical case I'd imagine would be a multihomed network, for which the administrator has different policies for links from different (or the same) ISPs. Or maybe even in an ISP's network. What would be the practical (dis)advantages of such a configuration? Could you provide an example of an existing topology using several border firewalls?


  • Slight stuttering when moving windows in fresh 12.04 install

    - by Konsolkongen
    Installed Ubuntu 12.04 today, and my problem is that when I'm moving windows around the screen it doesn't feel smooth at all. Usually I can fix this by changing the refresh rate to 60 Hz, but this time it doesn't help. My graphics card is an Nvidia GTX 560 Ti and I've tried the 295.40, 295.45 and 304.43 drivers (currently on the last), but none of them has resolved my problem. I searched around a bit and tried changing the refresh rate using compizconfig-settings-manager and xrandr. No change using CCSM, but when I tried xrandr I got this reply, which is nonsense of course:

        konsolkongen@konsolkongen-desktop:~$ xrandr -r 60
        Rate 60.0 Hz not available for this size

    This is what my xorg.conf file looks like:

        # nvidia-settings: X configuration file generated by nvidia-settings
        # nvidia-settings: version 295.33 (buildd@allspice) Fri Mar 30 15:25:24 UTC 2012

        Section "ServerLayout"
            Identifier "Layout0"
            Screen 0 "Screen0" 0 0
            InputDevice "Keyboard0" "CoreKeyboard"
            InputDevice "Mouse0" "CorePointer"
            Option "Xinerama" "0"
        EndSection

        Section "Files"
        EndSection

        Section "InputDevice"
            # generated from default
            Identifier "Mouse0"
            Driver "mouse"
            Option "Protocol" "auto"
            Option "Device" "/dev/psaux"
            Option "Emulate3Buttons" "no"
            Option "ZAxisMapping" "4 5"
        EndSection

        Section "InputDevice"
            # generated from default
            Identifier "Keyboard0"
            Driver "kbd"
        EndSection

        Section "Monitor"
            # HorizSync source: edid, VertRefresh source: edid
            Identifier "Monitor0"
            VendorName "Unknown"
            ModelName "Samsung SyncMaster"
            HorizSync 30.0 - 81.0
            VertRefresh 56.0 - 75.0
            Option "DPMS"
        EndSection

        Section "Device"
            Identifier "Device0"
            Driver "nvidia"
            VendorName "NVIDIA Corporation"
            BoardName "GeForce GTX 560 Ti"
        EndSection

        Section "Screen"
            Identifier "Screen0"
            Device "Device0"
            Monitor "Monitor0"
            DefaultDepth 24
            Option "TwinView" "0"
            Option "TwinViewXineramaInfoOrder" "DFP-0"
            Option "metamodes" "DFP-0: 1680x1050_60 +0+0"
            SubSection "Display"
                Depth 24
            EndSubSection
        EndSection

    Any help would be greatly appreciated; my obsession with video quality can't stand stuttering like this. For what it's worth, I don't have any screen tearing, so at least V-sync is on. Thanks.
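
    Since the metamodes line already pins the display at 1680x1050_60, xrandr needs the size as well as the rate; a sketch of the invocation (the size is taken from the metamodes entry above):

        xrandr -s 1680x1050 -r 60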


  • snmptrap and snmptt - authcommunity not found

    - by sabs6488
    I am trying to configure snmptt to translate received SNMP traps and handle them as passive checks in an Icinga monitoring server, as described here. After making the changes to snmptrapd.conf, I try to restart the service and I see "authcommunity: command not found" and "traphandle: command not found". My understanding is that authcommunity and traphandle are just configuration directives which tell snmptrapd the community string to use and the trap-handler script to call. It would be helpful if someone could help me understand this better. Thanks, sabs
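
    "command not found" is a shell error, which suggests those directives ended up in a file the shell executes (an init script or /etc/default file) rather than in snmptrapd's own configuration. For reference, a sketch of where they normally live (the community string and handler path are assumptions):

        # /etc/snmp/snmptrapd.conf
        authCommunity log,execute,net public
        traphandle default /usr/sbin/snmptt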


  • When adding a second processor to SQL Server, will it automatically balance the load?

    - by ddavis
    We have SQL Server 2008 R2 (10.5) on a dedicated box with a single 2.4 GHz processor, which regularly runs at 70-80% CPU. We are going to be adding a significant number of users to the application and therefore want to add a second processor to the box (scale up). Will SQL Server automatically use the second processor to balance threads, or is there additional configuration that will need to be done? In other words, will adding the second processor drop my CPU usage to 35-40% per CPU, automatically balancing the load? Based on what I read here, it seems that it will: http://msdn.microsoft.com/en-us/library/ms181007.aspx However, I've read elsewhere that CPU performance gains can be made by assigning database tables to different filegroups, but I'm not sure we want to get that complicated at this point.
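
    By default, SQL Server's scheduler spreads worker threads across all CPUs it is allowed to see (affinity masking is only for restricting CPUs), so no extra configuration should be needed. One way to verify after the upgrade, sketched with a standard DMV:

        SELECT scheduler_id, cpu_id, status, current_tasks_count, load_factor
        FROM sys.dm_os_schedulers
        WHERE scheduler_id < 255;  -- user schedulers only, one per CPU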


  • Safety concerns on allowing connections to MySQL with no password on localhost?

    - by ÉricO
    In the case of a Linux system, is there any security concern in letting MySQL users with standard privileges (that is, not the root users) connect to the database with no password from localhost? I think that enforcing a password even for localhost can add a layer of protection, since with no password the database access would be compromised if the SSH access is itself compromised. Considering that, would it be less safe to allow no-password connections to MySQL than to have the same password for SSH and for MySQL? I don't know if it should be taken into account, but we also use phpMyAdmin to let users administrate their own databases. I am asking because I kinda dislike having to put our database passwords unencrypted in the source or configuration files of our applications, where they can easily be leaked unintentionally. Since our servers are configured to run our applications as the Linux user the application belongs to, I was considering allowing no password from localhost as a simple solution. So, would that be a very bad idea or not?
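
    Since the applications already run as their own Linux users, one relevant middle ground is socket peer authentication, which ties a passwordless MySQL account to one specific Linux user; a sketch (the auth_socket plugin ships with MySQL but may not be enabled, is called unix_socket on MariaDB, and the names below are invented):

        INSTALL PLUGIN auth_socket SONAME 'auth_socket.so';
        CREATE USER 'appuser'@'localhost' IDENTIFIED WITH auth_socket;
        GRANT ALL PRIVILEGES ON appdb.* TO 'appuser'@'localhost';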


  • How to make sure clients update their browser cache when my website is updated?

    - by user64204
    I am using the HTTP 1.1 Cache-Control header to implement client-side caching. Since I update my website only once a month, I would like the CSS and JS files to be cached for 30 days with Cache-Control: max-age=2592000. The problem is that the 30-day period defined by Cache-Control doesn't coincide with the website update cycle: it starts from the moment the users visit the site and ends 30 days later, which means an update could occur in the meantime and users would be running with outdated content for a while, which could break the rendering of the website if, for instance, the HTML and CSS no longer match. How can I perform client-side caching of content for periods of several days but somehow get users to refresh their CSS/JS files after the website has been updated? One solution I could think of is that, if website updates can be scheduled, the max-age returned by the server could be decreased every day accordingly, so that no matter when people visit the website, the end of the caching period would coincide with the update of the website. But changing the server configuration every day goes against one of my sysadmin principles: once it's running, don't touch it.
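
    For reference, a sketch of the header configuration in question, assuming Apache with mod_headers (the question doesn't name the web server):

        # cache CSS and JS for 30 days
        <FilesMatch "\.(css|js)$">
            Header set Cache-Control "max-age=2592000"
        </FilesMatch>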


  • Wireless Connection unstable with multiple devices connected

    - by KingIsulgard
    My wireless network works perfectly when only one device is connected: super fast, full strength. But as soon as multiple devices are connected to the wireless network, the connections become unstable (constantly losing the connection to the internet, not to the network itself). It's quite annoying. I have a Sitecom Wireless 300N XR Gigabit Router WL-306, which should be a decent router, so I'm guessing there must be something wrong with my configuration. Does any of you know what might cause this? Thanks


  • IIS and PHP restrict IO permissions

    - by ULTRA_POROV
    I have PHP installed through a FastCGI module. Is there a way to restrict the module's (php.exe) read/write permissions to only the directory (plus subdirectories) of the IIS site that is calling it? I need this to prevent one IIS PHP site from having access to files outside its own directory. How can this be done? Is there a setting in php.ini or in the IIS configuration? I believe such a feature could exist, because when a file on the server is requested, the root path of the site is also known; all it would take is for IIS to pass this path to the PHP module, and for the PHP module to allow only IO operations within this path. PS: I know it is possible to achieve this by using a different Windows account for each website; that is not an option.
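
    On the php.ini side, open_basedir does roughly what is described here: it confines PHP's file operations to the listed paths. A sketch using a per-path section (the site path is an invented example; note this restricts PHP's own IO, not the OS-level permissions of php.exe):

        ; php.ini
        [PATH=C:\inetpub\wwwroot\site1]
        open_basedir = "C:\inetpub\wwwroot\site1;C:\Windows\Temp"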


  • SNMP - So I have a MIB. Now What?

    - by senfo
    I can't seem to get my head wrapped around the purpose of a MIB. I have a collection of ~20 MIB files that were supplied by the vendor, but what do I do with them? I also have a few OIDs, supplied by the vendor, that don't seem to be valid. When I issue snmpget -v1 -c public 192.168.0.123 .1.4.6.3.2.6.2 (assume that's a valid OID), I get an error indicating the variable is unknown. Does this sound like a hardware configuration problem? Do I need to "load" (for lack of better words) the MIB into the device? Unfortunately, the vendor has been completely unresponsive to my emailed questions, so any help would be greatly appreciated.
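
    For what it's worth, MIBs are loaded on the manager side, not into the device; they map between names and numeric OIDs. A net-snmp sketch of pointing the tools at the vendor files (the MIB directory and object name are placeholders):

        # translate a name to an OID using the vendor MIBs
        snmptranslate -M +/opt/vendor/mibs -m ALL -IR someVendorObject
        # query using the same MIB search path
        snmpget -v1 -c public -M +/opt/vendor/mibs -m ALL 192.168.0.123 sysDescr.0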


  • Taking two actions in monit

    - by Oddthinking
    My monit script works to detect an outage with a process and inform me when the rule is:

        IF DOES NOT EXIST THEN ALERT

    My monit script works to detect an outage and automatically fix it when the rule is:

        IF DOES NOT EXIST THEN START

    But what I want it to do is inform me AND fix it. Two rules in a row seem to make it ignore all but the last:

        IF DOES NOT EXIST THEN ALERT
        IF DOES NOT EXIST THEN START   # No alert given.

    I could use a custom script that does both, and

        IF DOES NOT EXIST THEN EXEC "my_handwritten_script"

    but I was trying to move away from a mess of hand-written scripts towards a clean Monit configuration. Can I configure Monit to take two actions?
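
    In monit, actions such as restart generate an alert on their own once a mail recipient is registered, so a single rule may cover both needs; a sketch (the pidfile, start program and address are assumptions):

        check process myapp with pidfile /var/run/myapp.pid
            start program = "/etc/init.d/myapp start"
            if does not exist then restart
            alert admin@example.com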


  • x265 in ffmpeg

    - by Levan
    Today I found out that x265 is already present in ffmpeg, so I compiled ffmpeg with this guide. Sadly, libx265 did not work on Ubuntu; however, on Windows I tried the same thing with a Zeranoe ffmpeg build and it worked without a problem. So do you think I did something wrong, or is it not yet implemented in the Linux build (using that guide)? The results of the command ffmpeg -codecs | grep -i hevc show:

        ffmpeg version 2.1.git Copyright (c) 2000-2014 the FFmpeg developers
        built on Feb 19 2014 19:00:17 with gcc 4.8 (Ubuntu/Linaro 4.8.1-10ubuntu9)
        configuration: --prefix=/home/levan/ffmpeg_build --extra-cflags=-I/home/levan/ffmpeg_build/include --extra-ldflags=-L/home/levan/ffmpeg_build/lib --bindir=/home/levan/bin --extra-libs=-ldl --enable-gpl --enable-libass --enable-libfdk-aac --enable-libmp3lame --enable-libopus --enable-libtheora --enable-libvorbis --enable-libvpx --enable-libx264 --enable-nonfree --enable-x11grab
        libavutil      52. 64.100 / 52. 64.100
        libavcodec     55. 52.102 / 55. 52.102
        libavformat    55. 33.100 / 55. 33.100
        libavdevice    55. 10.100 / 55. 10.100
        libavfilter     4.  1.102 /  4.  1.102
        libswscale      2.  5.101 /  2.  5.101
        libswresample   0. 17.104 /  0. 17.104
        libpostproc    52.  3.100 / 52.  3.100
        D.V.L. hevc    H.265 / HEVC (High Efficiency Video Coding)

    Thank you for your time.
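
    One thing the output itself shows: the configuration line has no --enable-libx265, so this build can decode HEVC (the leading D in D.V.L.) but was not built with the x265 encoder. A sketch of the missing steps, assuming x265 is built and installed into the same prefix as the other libraries:

        ./configure --prefix=$HOME/ffmpeg_build --enable-gpl --enable-libx264 --enable-libx265 --enable-nonfree
        make && make install
        # an encoder entry for libx265 should then show up in:
        ffmpeg -codecs | grep -i hevc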

