Search Results

Search found 7078 results on 284 pages for 'extended range'.

Page 146 of 284

  • TP-Link TL-WA701N not working well as a wireless extender

    - by djechelon
    I bought the device in the subject line to extend the range of my WPA2/PSK-protected Wi-Fi network, which is powered by a TP-Link TL-WR340G (AP+router). I configured it as follows:
    - Operation mode: Universal Repeater
    - MAC of AP: scanned for my SSID and picked it up
    - Channel width: 20 MHz
    - Security options: the same as the parent AP (WPA2/PSK with AES encryption)
    After configuration, inSSIDer shows me two APs beaconing the same SSID at different SNRs (because my laptop was close to the extender). After a few hours my tablet, far from the parent AP, stopped working. A scan reported two networks with the same SSID: one WPA-protected and one completely open. This happens very frequently. Rebooting the extender by unplugging it helps, but not for long. Sometimes the extender stops transmitting altogether; sometimes it beacons an open network that nobody can connect to (because there is no DHCP). What's wrong with my configuration?

    Read the article

  • Comparison of cloud hosting providers

    - by Abel
    Is there a place where we can compare* the many new cloud hosting providers that keep appearing? From reading about each of them, they seem very different, ranging from hosting applications only (Google) to a semi-full enterprise web serving framework (Rackspace). Comparing "by hand" takes a lot of time. All have limitations and different pricing, but what are those and how do they compare? I'm looking for an unbiased comparison site, rather than a discussion of "which is the best". * I don't mean a hosting provider comparison site, of which there are many. The properties of cloud hosting providers are remarkably different and don't compare well on classical hosting provider comparison charts.

    Read the article

  • How to avoid OLEDB converting "."s into "#"s in column names?

    - by Andrew Miner
    I'm using the ACE OLEDB driver to read from an Excel 2007 spreadsheet, and I'm finding that any '.' character in a column name gets converted to a '#' character. For example, if I have the following in a spreadsheet:

        Name     Amt. Due   Due Date
        Andrew   12.50      4/1/2010
        Brian    20.00      4/12/2010
        Charlie  1000.00    6/30/2010

    the name of the second column is reported as "Amt# Due" when read with the following code:

        OleDbConnection connection = new OleDbConnection(
            "Provider=Microsoft.ACE.OLEDB.12.0; Data Source={0}; " +
            "Extended Properties=\"Excel 12.0 Xml;HDR=YES;FMT=Delimited;IMEX=1\"");
        OleDbCommand command = new OleDbCommand("SELECT * FROM MyTable", connection);
        OleDbDataReader dataReader = command.ExecuteReader();
        System.Console.WriteLine(dataReader.GetName(1));

    I've read through all the documentation I can find and haven't seen anything that even mentions this will happen. Has anyone run into this before? Is there a way to fix this behavior?
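
    One possible workaround, sketched below on the assumption that the driver's renaming of '.' cannot simply be switched off: open the sheet a second time with HDR=NO so the header row comes back as ordinary data, then use those cell values as the real column captions. The sheet name "Sheet1$" and the file path are placeholders, not values from the question.

        // Minimal sketch (not the asker's code): recover the original header text
        // by reading the first row as data instead of letting the driver rename it.
        using System;
        using System.Data.OleDb;

        class HeaderRecovery
        {
            static string[] ReadOriginalHeaders(string path)
            {
                string connStr =
                    "Provider=Microsoft.ACE.OLEDB.12.0;Data Source=" + path + ";" +
                    "Extended Properties=\"Excel 12.0 Xml;HDR=NO;IMEX=1\"";

                using (var connection = new OleDbConnection(connStr))
                using (var command = new OleDbCommand("SELECT * FROM [Sheet1$]", connection))
                {
                    connection.Open();
                    using (var reader = command.ExecuteReader())
                    {
                        reader.Read();                      // first row = untouched header text
                        var headers = new string[reader.FieldCount];
                        for (int i = 0; i < reader.FieldCount; i++)
                            headers[i] = Convert.ToString(reader.GetValue(i));  // e.g. "Amt. Due"
                        return headers;
                    }
                }
            }
        }

    The recovered names can then be mapped by position onto whatever GetName() reports in the HDR=YES pass.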

    Read the article

  • How can I setup a subnetwork using DDWRT and a 2Wire?

    - by MadBurn
    OK, I'm having a bit of trouble getting this set up, and I'm not really sure what direction I need to be looking in. I am on AT&T U-verse and have to use their horrible 2Wire modem/router, which gives me very limited control over my network, and its Wi-Fi range is too small for my house. I do have a wired Ethernet run to my TV, which I have hooked into a switch so I could split the line to my PS3. The problem I'm having is that I don't know how to get the Buffalo DD-WRT router set up as a subnetwork so that the PS3 can find my media server. Here is a diagram of what I'm trying to accomplish (blue is Ethernet).

    Read the article

  • Can I issue wireless clients with IP address on different subnet?

    - by Beanz
    We have a very standard setup with a Windows Server 2003 domain controller issuing IP addresses using DHCP. This works fine. Internet access is managed via Microsoft ISA Server 2006 Standard; clients are required to authenticate, and this works fine too. We now need to provide wireless internet access to visitors' laptops, iPhones etc. We've bought a couple of Netgear access points, so I was thinking we might be able to issue wireless clients connected to them with IP addresses on a different subnet and then allow non-authenticated Internet access via the ISA Server for that IP range. Does that sound plausible? I'm not even sure whether I can issue a different subnet to wireless clients.

    Read the article

  • Setting up Dynamic DNS for Wireless Cameras And Accessing them Remotely

    - by Mike Szp.
    I've been trying to set up two TP-LINK wireless N cameras that I bought so that I can see them remotely. I've set it up so that each has its own IP address (192...105 / 192...106), and I can access them if I type that into the browser of a local computer. The thing is that I don't know how to access them from a remote PC. My current setup is each camera connected to the router, which then connects to the modem. When I set up the Dynamic DNS and access the "webpage" for my IP from a remote computer, it just goes to the configuration page of the modem. I have no idea how to make it go to the router or to the cameras. The router has its own IP range of 192.168.1.x while the modem has 192.168.2.x. To access the cameras I type 192.168.1.114:100 into the web browser on the local computer, but I have no idea how to get there through the webpage of my Dynamic DNS remotely.

    Read the article

  • MySQL Query performance - huge difference in time

    - by Damo
    I have a query that is returning in vastly different amounts of time between 2 datasets. For one set (database A) it returns in a few seconds; for the other (database B)... well, I haven't waited long enough yet, but over 10 minutes. I have dumped both of these databases to my local machine where I can reproduce the issue running MySQL 5.1.37. Curiously, database B is smaller than database A. A stripped down version of the query that reproduces the problem is:

        SELECT * FROM po_shipment ps
        JOIN po_shipment_item psi USING (ship_id)
        JOIN po_alloc pa ON ps.ship_id = pa.ship_id AND pa.UID_items = psi.UID_items
        JOIN po_header ph ON pa.hdr_id = ph.hdr_id
        LEFT JOIN EVENT_TABLE ev0 ON ev0.TABLE_ID1 = ps.ship_id AND ev0.EVENT_TYPE = 'MAS0'
        LEFT JOIN EVENT_TABLE ev1 ON ev1.TABLE_ID1 = ps.ship_id AND ev1.EVENT_TYPE = 'MAS1'
        LEFT JOIN EVENT_TABLE ev2 ON ev2.TABLE_ID1 = ps.ship_id AND ev2.EVENT_TYPE = 'MAS2'
        LEFT JOIN EVENT_TABLE ev3 ON ev3.TABLE_ID1 = ps.ship_id AND ev3.EVENT_TYPE = 'MAS3'
        LEFT JOIN EVENT_TABLE ev4 ON ev4.TABLE_ID1 = ps.ship_id AND ev4.EVENT_TYPE = 'MAS4'
        LEFT JOIN EVENT_TABLE ev5 ON ev5.TABLE_ID1 = ps.ship_id AND ev5.EVENT_TYPE = 'MAS5'
        WHERE ps.eta >= '2010-03-22'
        GROUP BY ps.ship_id
        LIMIT 100;

    The EXPLAIN query plan for the first database (A), which returns in ~2 seconds, is (borders trimmed for readability):

        | id | select_type | table | type   | possible_keys | key | key_len | ref | rows | Extra |
        | 1  | SIMPLE      | ps    | range  | PRIMARY,IX_ETA_DATE | IX_ETA_DATE | 4 | NULL | 174 | Using where; Using temporary; Using filesort |
        | 1  | SIMPLE      | ev0   | ref    | IX_EVENT_ID_EVENT_TYPE | IX_EVENT_ID_EVENT_TYPE | 36 | UNIVIS_PROD.ps.ship_id,const | 1 | |
        | 1  | SIMPLE      | ev1   | ref    | IX_EVENT_ID_EVENT_TYPE | IX_EVENT_ID_EVENT_TYPE | 36 | UNIVIS_PROD.ps.ship_id,const | 1 | |
        | 1  | SIMPLE      | ev2   | ref    | IX_EVENT_ID_EVENT_TYPE | IX_EVENT_ID_EVENT_TYPE | 36 | UNIVIS_PROD.ps.ship_id,const | 1 | |
        | 1  | SIMPLE      | ev3   | ref    | IX_EVENT_ID_EVENT_TYPE | IX_EVENT_ID_EVENT_TYPE | 36 | UNIVIS_PROD.ps.ship_id,const | 1 | |
        | 1  | SIMPLE      | ev4   | ref    | IX_EVENT_ID_EVENT_TYPE | IX_EVENT_ID_EVENT_TYPE | 36 | UNIVIS_PROD.ps.ship_id,const | 1 | |
        | 1  | SIMPLE      | ev5   | ref    | IX_EVENT_ID_EVENT_TYPE | IX_EVENT_ID_EVENT_TYPE | 36 | UNIVIS_PROD.ps.ship_id,const | 1 | |
        | 1  | SIMPLE      | psi   | ref    | PRIMARY,IX_po_shipment_item_po_shipment1,FK_po_shipment_item_po_shipment1 | IX_po_shipment_item_po_shipment1 | 4 | UNIVIS_PROD.ps.ship_id | 1 | |
        | 1  | SIMPLE      | pa    | ref    | IX_po_alloc_po_shipment_item2,IX_po_alloc_po_details_old,FK_po_alloc_po_shipment1,FK_po_alloc_po_shipment_item1,FK_po_alloc_po_header1 | FK_po_alloc_po_shipment1 | 4 | UNIVIS_PROD.psi.ship_id | 5 | Using where |
        | 1  | SIMPLE      | ph    | eq_ref | PRIMARY,IX_HDR_ID | PRIMARY | 4 | UNIVIS_PROD.pa.hdr_id | 1 | |

    The EXPLAIN query plan for the second database (B), which returns in 600 seconds, is:

        | id | select_type | table | type   | possible_keys | key | key_len | ref | rows | Extra |
        | 1  | SIMPLE      | ps    | range  | PRIMARY,IX_ETA_DATE | IX_ETA_DATE | 4 | NULL | 38 | Using where; Using temporary; Using filesort |
        | 1  | SIMPLE      | psi   | ref    | PRIMARY,IX_po_shipment_item_po_shipment1,FK_po_shipment_item_po_shipment1 | IX_po_shipment_item_po_shipment1 | 4 | UNIVIS_DEV01.ps.ship_id | 1 | |
        | 1  | SIMPLE      | ev0   | ref    | IX_EVENT_ID_EVENT_TYPE | IX_EVENT_ID_EVENT_TYPE | 36 | UNIVIS_DEV01.psi.ship_id,const | 1 | |
        | 1  | SIMPLE      | ev1   | ref    | IX_EVENT_ID_EVENT_TYPE | IX_EVENT_ID_EVENT_TYPE | 36 | UNIVIS_DEV01.psi.ship_id,const | 1 | |
        | 1  | SIMPLE      | ev2   | ref    | IX_EVENT_ID_EVENT_TYPE | IX_EVENT_ID_EVENT_TYPE | 36 | UNIVIS_DEV01.ps.ship_id,const | 1 | |
        | 1  | SIMPLE      | ev3   | ref    | IX_EVENT_ID_EVENT_TYPE | IX_EVENT_ID_EVENT_TYPE | 36 | UNIVIS_DEV01.psi.ship_id,const | 1 | |
        | 1  | SIMPLE      | ev4   | ref    | IX_EVENT_ID_EVENT_TYPE | IX_EVENT_ID_EVENT_TYPE | 36 | UNIVIS_DEV01.psi.ship_id,const | 1 | |
        | 1  | SIMPLE      | ev5   | ref    | IX_EVENT_ID_EVENT_TYPE | IX_EVENT_ID_EVENT_TYPE | 36 | UNIVIS_DEV01.ps.ship_id,const | 1 | |
        | 1  | SIMPLE      | pa    | ref    | IX_po_alloc_po_shipment_item2,IX_po_alloc_po_details_old,FK_po_alloc_po_shipment1,FK_po_alloc_po_shipment_item1,FK_po_alloc_po_header1 | IX_po_alloc_po_shipment_item2 | 4 | UNIVIS_DEV01.ps.ship_id | 4 | Using where |
        | 1  | SIMPLE      | ph    | eq_ref | PRIMARY,IX_HDR_ID | PRIMARY | 4 | UNIVIS_DEV01.pa.hdr_id | 1 | |

    When database B is running I can look at the MySQL Administrator and the state remains at "Copying to tmp table" indefinitely. Database A also shows this state, but only for a second or so. There are no differences in the table structure, indexes, keys etc. between these databases (I have done SHOW CREATE TABLE on both and diff'd them). The table sizes are:

        database A: po_shipment 1776, po_shipment_item 1945, po_alloc 36298, po_header 71642, EVENT_TABLE 1608
        database B: po_shipment 463, po_shipment_item 470, po_alloc 3291, po_header 56149, EVENT_TABLE 1089

    Some points to note:
    - Removing the WHERE clause makes the query return in < 1 sec.
    - Removing the GROUP BY makes the query return in < 1 sec.
    - Removing ev5, ev4, ev3 etc. makes the query faster for each one removed.
Can anyone suggest how to resolve this issue? What have I missed? Many Thanks.
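
    Two low-risk things sometimes worth trying before restructuring the query, offered here as a hedged sketch rather than a diagnosis: refresh the index statistics (the plans differ mainly in join order and index choice for pa, which often comes down to stale statistics on the smaller database), and if that doesn't change the plan, pin the written join order with STRAIGHT_JOIN and compare.

        -- Hedged suggestions, not a confirmed fix for this particular query.
        ANALYZE TABLE po_shipment, po_shipment_item, po_alloc, po_header, EVENT_TABLE;

        -- Force the optimizer to join in the order the query is written:
        SELECT STRAIGHT_JOIN *
        FROM po_shipment ps
        JOIN po_shipment_item psi USING (ship_id)
        JOIN po_alloc pa ON ps.ship_id = pa.ship_id AND pa.UID_items = psi.UID_items
        JOIN po_header ph ON pa.hdr_id = ph.hdr_id
        LEFT JOIN EVENT_TABLE ev0 ON ev0.TABLE_ID1 = ps.ship_id AND ev0.EVENT_TYPE = 'MAS0'
        -- ... ev1 through ev5 exactly as in the original query ...
        WHERE ps.eta >= '2010-03-22'
        GROUP BY ps.ship_id
        LIMIT 100;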

    Read the article

  • MacPorts, how to run "post-destroot" script

    - by Potatoswatter
    I'm trying to install MacPorts gdb; it seems to be poorly supported… Running "port install" installs it to /opt/local/libexec/gnubin/gdb, but the intent doesn't seem to be to add that to $PATH. The portfile doesn't define any parameters for port select, which is typically used to set a MacPorts installation to handle default Unix commands. But it does include these lines:

        foreach binary [glob -tails -directory ${destroot}${prefix}/bin g*] {
            ln -s ${prefix}/bin/${binary} ${destroot}${prefix}/libexec/gnubin/[string range $binary 1 end]
        }

    This is buried under an action labeled post-destroot. destroot is a MacPorts command but post-destroot is not. The script is apparently not run by port install or port activate, or if it is failing, it is doing so silently. Is there a better approach than creating the links manually?
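
    If the goal is simply to get the un-prefixed GNU names (including gdb) onto the command line, one simpler route than re-running the portfile's post-destroot block is to put the gnubin directory on PATH. A hedged sketch, assuming a bash-style login shell:

        # Put MacPorts' gnubin directory (where the portfile links the un-prefixed
        # names) ahead of the regular bin directories. Adjust the startup file to
        # whichever one your shell actually reads.
        echo 'export PATH=/opt/local/libexec/gnubin:$PATH' >> ~/.profile
        source ~/.profile
        gdb --version   # should now resolve to the MacPorts build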

    Read the article

  • SQL Server Reporting Services - Fast TimeDataRetrieval - Long TimeProcessing

    - by user197529
    An application that I support has recently begun experiencing extended periods of time to execute reports in SQL Server Reporting Services. The reports being executed are not terribly complex. There are multiple stored procedures (between 5 and 8) which return anywhere from a handful to 8000 records total. Reports are generally from 2 to 100 pages. One can argue (and I have) the benefit of a 100-page report, but the client is footing the bill. At any rate, the problem is that even a report with 500 records (11 pages) takes 5 minutes to return to the browser. In the execution log the TimeDataRetrieval is 60 seconds, but the TimeProcessing is 235 seconds. It seems bizarre to me that my queries run so quickly, yet it takes Reporting Services so long to process the data. Any suggestions are greatly appreciated. Kind Regards, Bernie

    Read the article

  • What are the implications of expanding an internal subnet mask?

    - by Philip
    Our network currently runs on a 192.168.0.x subnet, all controlled through DHCP, except for the few main servers that have hard-configured IP address settings. What would I break if I changed the DHCP-published subnet mask from 255.255.255.0 to 255.255.0.0? The reason for doing this is not a huge sudden influx of machines, but that I'd like to start partitioning specific devices into specific IP ranges (to be neat and tidy), e.g. printers will be 192.168.2.x. For what it's worth, I don't plan on changing the allocated DHCP address range, but rather want to move some of the reserved and excluded DHCP addresses out of the address pool. I will obviously need to change the subnet mask manually on my manually configured devices.

    Read the article

  • How to reload a jqGrid in ASP.NET MVC when a dropdown list changes

    - by sandeep
    what is wrong in this code? when i change drop down list,the grid takes old value of ddl only, not taken newely selected values why? <%-- $(function() { $("#StateId").change(function() { $('#TheForm').submit(); }); }); $(function() { $("#CityId").change(function() { $('#TheForm').submit(); }); }); $(function() { $("#HospitalName").change(function() { $('#TheForm').submit(); }); }); --% var gridimgpath = '/scripts/themes/coffee/images'; var gridDataUrl = '/Claim/DynamicGridData/'; jQuery(document).ready(function() { // $("#btnSearch").click(function() { var StateId = document.getElementById('StateId').value; var CityId = document.getElementById('CityId').value; var HName = document.getElementById('HospitalName').value; // alert(CityId); // alert(StateId); // alert(HName); if (StateId 0 && CityId == '' && HName == '') { CityId = 0; HName = 'Default'.toString(); // alert("elseif0" + HName.toString()); } else if (CityId 0 && StateId == '') { // alert("elseif1"); alert("Please Select State..") } else if (CityId 0 && StateId 0 && HName == '') { // alert("elseif2"); alert(CityId); alert(StateId); HName = "Default"; } else { // alert("else"); StateId = 0; CityId = 0; HName = "Default"; } jQuery("#list").jqGrid({ url: gridDataUrl + '?StateId=' + StateId + '&CityId=' + CityId + '&hospname=' + HName, datatype: 'json', mtype: 'GET', colNames: ['Id', 'HospitalName', 'Address', 'City', 'District', 'FaxNumber', 'PhoneNumber'], colModel: [{ name: 'HospitalId', index: 'HospitalId', width: 40, align: 'left' }, { name: 'HospitalName', index: 'HospitalName', width: 40, align: 'left' }, { name: 'Address1', Address: 'Address1', width: 300 }, { name: 'CityName', index: 'CityName', width: 100 }, { name: 'DistName', index: 'DistName', width: 100 }, { name: 'FaxNo', index: 'FaxNo', width: 100 }, { name: 'ContactNo1', index: 'PhoneNumber', width: 100 } ], pager: jQuery('#pager'), rowNum: 10, rowList: [5, 10, 20, 50], // sortname: 'Id,', sortname: '1', sortorder: "asc", viewrecords: true, //multiselect: true, //multikey: "ctrlKey", // imgpath: '/scripts/themes/coffee/images', imgpath: gridimgpath, caption: 'Hospital Search', width: 700, height: 250 }); $(function() { // $("#btnSearch").click(function() { $('#CityId').change(function() { alert("kjasd"); // Set the vars whenever the date range changes and then filter the results StateId = document.getElementById('StateId').value; CityId = document.getElementById('CityId').value; HName = 'default'; setGridUrl(); }); // Set the date range textbox values $('#StateId').val(StateId.toString()); $('#CityId').val(CityId.toString()); // Set the grid json url to get the data to display setGridUrl(); }); function setGridUrl() { alert(StateId); alert(CityId); alert("hi"); var newGridDataUrl = gridDataUrl + '?StateId=' + StateId + '&CityId=' + CityId + '&hospname=' + HName; jQuery('#list').jqGrid('setGridParam', { url: newGridDataUrl }).trigger("reloadGrid"); } // }); }); <%--<%using (Html.BeginForm("HospitalSearch", "Claim", FormMethod.Post, new { id = "TheForm" })) --% Hospital Search   State :   City :   Hospital Name :       <div id="jqGridContainer"> <table id="list" class="scroll" cellpadding="0" cellspacing="0"></table> </div>
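
    For reference, a stripped-down sketch of the usual reload pattern. The likely problem in the snippet above is that StateId/CityId are read once when the page loads and never re-read; the fix is to read the current dropdown values inside the change handler, update the grid's url with setGridParam, and trigger reloadGrid. Selector names and the URL below are taken from the snippet; everything else is illustrative, not a drop-in replacement.

        // Minimal sketch: one grid, URL rebuilt from the current filter values on every change.
        var gridDataUrl = '/Claim/DynamicGridData/';

        function currentUrl() {
            // Read the *current* values at reload time, not the ones captured at page load.
            var stateId = $('#StateId').val() || 0;
            var cityId = $('#CityId').val() || 0;
            var hName = $('#HospitalName').val() || 'Default';
            return gridDataUrl + '?StateId=' + stateId + '&CityId=' + cityId + '&hospname=' + hName;
        }

        $(document).ready(function () {
            jQuery('#list').jqGrid({
                url: currentUrl(),
                datatype: 'json',
                mtype: 'GET',
                colNames: ['Id', 'HospitalName', 'Address', 'City', 'District', 'FaxNumber', 'PhoneNumber'],
                colModel: [
                    { name: 'HospitalId', index: 'HospitalId', width: 40 },
                    { name: 'HospitalName', index: 'HospitalName', width: 40 },
                    { name: 'Address1', index: 'Address1', width: 300 },
                    { name: 'CityName', index: 'CityName', width: 100 },
                    { name: 'DistName', index: 'DistName', width: 100 },
                    { name: 'FaxNo', index: 'FaxNo', width: 100 },
                    { name: 'ContactNo1', index: 'ContactNo1', width: 100 }
                ],
                pager: '#pager',
                rowNum: 10,
                viewrecords: true
            });

            // Re-query whenever any of the filters changes, instead of submitting the form.
            $('#StateId, #CityId, #HospitalName').change(function () {
                jQuery('#list')
                    .jqGrid('setGridParam', { url: currentUrl(), datatype: 'json' })
                    .trigger('reloadGrid');
            });
        });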

    Read the article

  • Linux software to maintain old/backup versions of directory tree

    - by Bittrance
    I am replacing an old Linux file server serving NFS and CIFS. For the new server (still serving CIFS and NFS), I would like software that automatically and efficiently maintains old revisions of files in parallel trees, so that users can access them without special tools. I am looking for something akin to Time Machine or Flyback, but that works well on a server. The dataset is some 10,000 files weighing maybe 60 GB. Changes are relatively few, usually fewer than 100 file changes daily. Using LVM snapshots will not cut it, as the old revisions must reside on a separate set of disks from the live data. Edit: To clarify: keeping old revisions is a non-vital addition to the solution, so any suggestion will have to stay in the range of a few hundred euros.
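
    One commonly used building block for exactly this kind of setup (browsable dated trees on a separate disk, with unchanged files hard-linked so a day with 100 changed files costs almost nothing) is rsync with --link-dest. A hedged sketch, with paths invented for illustration; packaged tools such as rsnapshot or rdiff-backup wrap the same idea:

        #!/bin/sh
        # Nightly snapshot: /srv/data is the live tree, /backup sits on the separate
        # disks and holds one dated copy per day. Unchanged files become hard links
        # into yesterday's snapshot rather than fresh copies.
        TODAY=$(date +%F)
        YESTERDAY=$(date -d yesterday +%F)

        rsync -a --delete \
              --link-dest=/backup/$YESTERDAY \
              /srv/data/ /backup/$TODAY/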

    Read the article

  • Thinking of Powerline adapters for networking

    - by Henderson McCoy
    Will powerline work as well as Wi-Fi? I am now in a large house (renting a room); the router is in the front of the house and I am in the back. I don't know if Wi-Fi will have the range and stability, so I am looking at all the alternatives, from running a long cable to powerline. It looks like a powerline adapter set will cost about $50: http://www.bestbuy.com/site/Actiontec---500-Mbps-Powerline-Home-Theater-Network-Adapter-Kit/5215483.p?id=1218625358741&skuId=5215483 Has anyone used these or other powerline adapters? How is powerline's speed? Are they more stable than Wi-Fi? The powerline cost is fine; I am just curious about the performance.

    Read the article

  • Setting default version on Azure Blob Storage?

    - by Erik
    What is the easiest way, without having to create your own utility, to set the default service version to the latest in Azure Blob Storage ? http://msdn.microsoft.com/en-us/library/windowsazure/dd894041 There is basically nothing to be set in the Azure portal and I am having a difficult time finding working utilities to use for Azure. For some reason Azure is defaulting to the oldest version which does not send things like the http range header for example. Any utility that can do this ? Thank you.
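
    For what it's worth, a small console sketch that sets DefaultServiceVersion on the blob service, assuming the classic .NET storage client library (Microsoft.WindowsAzure.Storage); the account name, key and version string below are placeholders, and the linked MSDN page lists the versions that are actually valid:

        // Hedged sketch: set the blob service's DefaultServiceVersion once per account.
        using Microsoft.WindowsAzure.Storage;
        using Microsoft.WindowsAzure.Storage.Auth;

        class SetDefaultVersion
        {
            static void Main()
            {
                var account = new CloudStorageAccount(
                    new StorageCredentials("myaccount", "base64-account-key"), useHttps: true);

                var blobClient = account.CreateCloudBlobClient();

                // Read the current service properties, change only the default version.
                var properties = blobClient.GetServiceProperties();
                properties.DefaultServiceVersion = "2011-08-18";   // example value only
                blobClient.SetServiceProperties(properties);
            }
        }

    Once set, anonymous/unversioned requests are answered with that service version, which is what brings in behavior like the Range header handling mentioned above.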

    Read the article

  • One admin for multiple sites

    - by valya
    I have two sites with different SITE_IDs, but I want to have only one admin interface for both sites. I have a model which is just an extended FlatPage:

        # models.py
        class SFlatPage(FlatPage):
            currobjects = CurrentSiteManager('sites')
            galleries = models.ManyToManyField(Gallery)
            # etc

        # admin.py
        class SFlatPageAdmin(FlatPageAdmin):
            fieldsets = None

        admin.site.register(SFlatPage, SFlatPageAdmin)
        admin.site.unregister(FlatPage)

    I don't know why, but the admin interface only shows pages for the current site. On http://site1.com/admin/ I see flatpages for site1, on http://site2.com/admin/ I see flatpages for site2. But I want to see all pages in the http://site1.com/admin/ interface! What am I doing wrong?
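
    A likely explanation, offered as an assumption rather than a certainty: currobjects is the first (and only) manager declared on SFlatPage, so it becomes the model's default manager, and the admin changelist uses the default manager, which here is a CurrentSiteManager and therefore filters by SITE_ID. A minimal sketch of two ways around that (the admin hook is queryset() on older Django, get_queryset() on newer versions):

        # models.py -- declare a plain manager first so it becomes the default one.
        from django.db import models
        from django.contrib.flatpages.models import FlatPage
        from django.contrib.sites.managers import CurrentSiteManager


        class SFlatPage(FlatPage):
            objects = models.Manager()                  # default manager: no site filtering
            currobjects = CurrentSiteManager('sites')   # still available when filtering is wanted
            # galleries = models.ManyToManyField(Gallery)  # as in the original model


        # admin.py -- or force the admin to use the unfiltered manager explicitly.
        from django.contrib import admin
        from django.contrib.flatpages.admin import FlatPageAdmin


        class SFlatPageAdmin(FlatPageAdmin):
            fieldsets = None

            def queryset(self, request):                # rename to get_queryset() on newer Django
                return SFlatPage.objects.all()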

    Read the article

  • Puppet's automatically generated certificates failing

    - by gparent
    I am running a default configuration of Puppet on Debian Squeeze 6.0.4. The server's FQDN is master.example.com. The client's FQDN is client.example.com. I am able to contact the puppet master and send a CSR. I sign it using puppetca -sa but the client will still not connect. Date of both machines is within 2 seconds of Tue Apr 3 20:59:00 UTC 2012 as I wrote this sentence. This is what appears in /var/log/syslog: Apr 3 17:03:52 localhost puppet-agent[18653]: Reopening log files Apr 3 17:03:52 localhost puppet-agent[18653]: Starting Puppet client version 2.6.2 Apr 3 17:03:53 localhost puppet-agent[18653]: Could not retrieve catalog from remote server: SSL_connect returned=1 errno=0 state=SSLv3 read server certificate B: certificate verify failed Apr 3 17:03:53 localhost puppet-agent[18653]: Using cached catalog Apr 3 17:03:53 localhost puppet-agent[18653]: Could not retrieve catalog; skipping run Here is some interesting output: OpenSSL client test: client:~# openssl s_client -host master.example.com -port 8140 -cert /var/lib/puppet/ssl/certs/client.example.com.pem -key /var/lib/puppet/ssl/private_keys/client.example.com.pem -CAfile /var/lib/puppet/ssl/certs/ca.pem CONNECTED(00000003) depth=1 /CN=Puppet CA: master.example.com verify return:1 depth=0 /CN=master.example.com verify error:num=7:certificate signature failure verify return:1 depth=0 /CN=master.example.com verify return:1 18509:error:1409441B:SSL routines:SSL3_READ_BYTES:tlsv1 alert decrypt error:s3_pkt.c:1102:SSL alert number 51 18509:error:140790E5:SSL routines:SSL23_WRITE:ssl handshake failure:s23_lib.c:188: client:~# master's certificate: root@master:/etc/puppet# openssl x509 -text -noout -in /etc/puppet/ssl/certs/master.example.com.pem Certificate: Data: Version: 3 (0x2) Serial Number: 2 (0x2) Signature Algorithm: sha1WithRSAEncryption Issuer: CN=Puppet CA: master.example.com Validity Not Before: Apr 2 20:01:28 2012 GMT Not After : Apr 2 20:01:28 2017 GMT Subject: CN=master.example.com Subject Public Key Info: Public Key Algorithm: rsaEncryption RSA Public Key: (1024 bit) Modulus (1024 bit): 00:a9:c1:f9:4c:cd:0f:68:84:7b:f4:93:16:20:44: 7a:2b:05:8e:57:31:05:8e:9c:c8:08:68:73:71:39: c1:86:6a:59:93:6e:53:aa:43:11:83:5b:2d:8c:7d: 54:05:65:c1:e1:0e:94:4a:f0:86:58:c3:3d:4f:f3: 7d:bd:8e:29:58:a6:36:f4:3e:b2:61:ec:53:b5:38: 8e:84:ac:5f:a3:e3:8c:39:bd:cf:4f:3c:ff:a9:65: 09:66:3c:ba:10:14:69:d5:07:57:06:28:02:37:be: 03:82:fb:90:8b:7d:b3:a5:33:7b:9b:3a:42:51:12: b3:ac:dd:d5:58:69:a9:8a:ed Exponent: 65537 (0x10001) X509v3 extensions: X509v3 Basic Constraints: critical CA:FALSE Netscape Comment: Puppet Ruby/OpenSSL Internal Certificate X509v3 Key Usage: critical Digital Signature, Key Encipherment X509v3 Subject Key Identifier: 8C:2F:14:84:B6:A1:B5:0C:11:52:36:AB:E5:3F:F2:B9:B3:25:F3:1C X509v3 Extended Key Usage: critical TLS Web Server Authentication, TLS Web Client Authentication Signature Algorithm: sha1WithRSAEncryption 7b:2c:4f:c2:76:38:ab:03:7f:c6:54:d9:78:1d:ab:6c:45:ab: 47:02:c7:fd:45:4e:ab:b5:b6:d9:a7:df:44:72:55:0c:a5:d0: 86:58:14:ae:5f:6f:ea:87:4d:78:e4:39:4d:20:7e:3d:6d:e9: e2:5e:d7:c9:3c:27:43:a4:29:44:85:a1:63:df:2f:55:a9:6a: 72:46:d8:fb:c7:cc:ca:43:e7:e1:2c:fe:55:2a:0d:17:76:d4: e5:49:8b:85:9f:fa:0e:f6:cc:e8:28:3e:8b:47:b0:e1:02:f0: 3d:73:3e:99:65:3b:91:32:c5:ce:e4:86:21:b2:e0:b4:15:b5: 22:63 root@master:/etc/puppet# CA's certificate: root@master:/etc/puppet# openssl x509 -text -noout -in /etc/puppet/ssl/certs/ca.pem Certificate: Data: Version: 3 (0x2) Serial Number: 1 (0x1) Signature Algorithm: sha1WithRSAEncryption 
Issuer: CN=Puppet CA: master.example.com Validity Not Before: Apr 2 20:01:05 2012 GMT Not After : Apr 2 20:01:05 2017 GMT Subject: CN=Puppet CA: master.example.com Subject Public Key Info: Public Key Algorithm: rsaEncryption RSA Public Key: (1024 bit) Modulus (1024 bit): 00:b5:2c:3e:26:a3:ae:43:b8:ed:1e:ef:4d:a1:1e: 82:77:78:c2:98:3f:e2:e0:05:57:f0:8d:80:09:36: 62:be:6c:1a:21:43:59:1d:e9:b9:4d:e0:9c:fa:09: aa:12:a1:82:58:fc:47:31:ed:ad:ad:73:01:26:97: ef:d2:d6:41:6b:85:3b:af:70:00:b9:63:e9:1b:c3: ce:57:6d:95:0e:a6:d2:64:bd:1f:2c:1f:5c:26:8e: 02:fd:d3:28:9e:e9:8f:bc:46:bb:dd:25:db:39:57: 81:ed:e5:c8:1f:3d:ca:39:cf:e7:f3:63:75:f6:15: 1f:d4:71:56:ed:84:50:fb:5d Exponent: 65537 (0x10001) X509v3 extensions: X509v3 Basic Constraints: critical CA:TRUE Netscape Comment: Puppet Ruby/OpenSSL Internal Certificate X509v3 Key Usage: critical Certificate Sign, CRL Sign X509v3 Subject Key Identifier: 8C:2F:14:84:B6:A1:B5:0C:11:52:36:AB:E5:3F:F2:B9:B3:25:F3:1C Signature Algorithm: sha1WithRSAEncryption 1d:cd:c6:65:32:42:a5:01:62:46:87:10:da:74:7e:8b:c8:c9: 86:32:9e:c2:2e:c1:fd:00:79:f0:ef:d8:73:dd:7e:1b:1a:3f: cc:64:da:a3:38:ad:49:4e:c8:4d:e3:09:ba:bc:66:f2:6f:63: 9a:48:19:2d:27:5b:1d:2a:69:bf:4f:f4:e0:67:5e:66:84:30: e5:85:f4:49:6e:d0:92:ae:66:77:50:cf:45:c0:29:b2:64:87: 12:09:d3:10:4d:91:b6:f3:63:c4:26:b3:fa:94:2b:96:18:1f: 9b:a9:53:74:de:9c:73:a4:3a:8d:bf:fa:9c:c0:42:9d:78:49: 4d:70 root@master:/etc/puppet# Client's certificate: client:~# openssl x509 -text -noout -in /var/lib/puppet/ssl/certs/client.example.com.pem Certificate: Data: Version: 3 (0x2) Serial Number: 3 (0x3) Signature Algorithm: sha1WithRSAEncryption Issuer: CN=Puppet CA: master.example.com Validity Not Before: Apr 2 20:01:36 2012 GMT Not After : Apr 2 20:01:36 2017 GMT Subject: CN=client.example.com Subject Public Key Info: Public Key Algorithm: rsaEncryption RSA Public Key: (1024 bit) Modulus (1024 bit): 00:ae:88:6d:9b:e3:b1:fc:47:07:d6:bf:ea:53:d1: 14:14:9b:35:e6:70:43:e0:58:35:76:ac:c5:9d:86: 02:fd:77:28:fc:93:34:65:9d:dd:0b:ea:21:14:4d: 8a:95:2e:28:c9:a5:8d:a2:2c:0e:1c:a0:4c:fa:03: e5:aa:d3:97:98:05:59:3c:82:a9:7c:0e:e9:df:fd: 48:81:dc:33:dc:88:e9:09:e4:19:d6:e4:7b:92:33: 31:73:e4:f2:9c:42:75:b2:e1:9f:d9:49:8c:a7:eb: fa:7d:cb:62:22:90:1c:37:3a:40:95:a7:a0:3b:ad: 8e:12:7c:6e:ad:04:94:ed:47 Exponent: 65537 (0x10001) X509v3 extensions: X509v3 Basic Constraints: critical CA:FALSE Netscape Comment: Puppet Ruby/OpenSSL Internal Certificate X509v3 Key Usage: critical Digital Signature, Key Encipherment X509v3 Subject Key Identifier: 8C:2F:14:84:B6:A1:B5:0C:11:52:36:AB:E5:3F:F2:B9:B3:25:F3:1C X509v3 Extended Key Usage: critical TLS Web Server Authentication, TLS Web Client Authentication Signature Algorithm: sha1WithRSAEncryption 33:1f:ec:3c:91:5a:eb:c6:03:5f:a1:58:60:c3:41:ed:1f:fe: cb:b2:40:11:63:4d:ba:18:8a:8b:62:ba:ab:61:f5:a0:6c:0e: 8a:20:56:7b:10:a1:f9:1d:51:49:af:70:3a:05:f9:27:4a:25: d4:e6:88:26:f7:26:e0:20:30:2a:20:1d:c4:d3:26:f1:99:cf: 47:2e:73:90:bd:9c:88:bf:67:9e:dd:7c:0e:3a:86:6b:0b:8d: 39:0f:db:66:c0:b6:20:c3:34:84:0e:d8:3b:fc:1c:a8:6c:6c: b1:19:76:65:e6:22:3c:bf:ff:1c:74:bb:62:a0:46:02:95:fa: 83:41 client:~#
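
    A common recovery path for this class of "certificate verify failed" problem, offered as a guess rather than a diagnosis, is to throw away the agent's cached SSL state and have the CA re-issue its certificate. Roughly, with Puppet 2.6-era commands (back up both ssl directories first):

        # On the master: remove the old client certificate from the CA.
        puppetca --clean client.example.com

        # On the client: discard cached SSL material and request a fresh certificate.
        rm -rf /var/lib/puppet/ssl
        puppetd --test --waitforcert 30 --server master.example.com

        # Back on the master: sign the new request (if autosign is off).
        puppetca --sign client.example.com

    If the openssl test still reports the signature failure at depth=0 afterwards, the master's own host certificate may not match its CA, and regenerating that certificate (not the CA itself) the same way would be the next thing to try.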

    Read the article

  • Excel Help: Userforms

    - by B-Ballerl
    I have developed a macro that does a whole bunch of things for me based on a few inputs (importing files). The file names are dated dd_mm_yyyy, and right now I enter them into a sheet that the macro reads the information from. Not really wanting this, I designed a userform where the user can enter the "dd", "mm", "yyyy" and how many consecutive days of files there are. For example, with 28_06_2011.txt and 29_06_2011.txt there would be one consecutive day. I want to be able to use the information entered in the userform (day, month, year, and consecutive days) in the macro, but I have been unsuccessful because I don't know how to reference that information. Is it similar to referring to a range in a worksheet? Thanks in advance for any help.
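
    A minimal sketch of the usual pattern, with the form name (frmImport) and control names (txtDay, txtMonth, txtYear, txtDays, btnOK) invented for illustration: show the form, let its OK button copy the values into public variables, and read those variables from the macro once the form has closed.

        ' In a standard module -- names are placeholders, adjust to your project.
        Public ImportDay As Integer
        Public ImportMonth As Integer
        Public ImportYear As Integer
        Public ConsecutiveDays As Integer

        Sub RunImport()
            frmImport.Show                          ' modal: code resumes after the form unloads
            Dim i As Integer, d As Date, fileName As String
            For i = 0 To ConsecutiveDays            ' 1 consecutive day = 2 files, as in the example
                d = DateSerial(ImportYear, ImportMonth, ImportDay) + i
                fileName = Format(d, "dd") & "_" & Format(d, "mm") & "_" & Format(d, "yyyy") & ".txt"
                ' ... call the existing import routine with fileName ...
            Next i
        End Sub

        ' In the userform's code module (frmImport):
        Private Sub btnOK_Click()
            ImportDay = CInt(Me.txtDay.Value)
            ImportMonth = CInt(Me.txtMonth.Value)
            ImportYear = CInt(Me.txtYear.Value)
            ConsecutiveDays = CInt(Me.txtDays.Value)
            Unload Me
        End Sub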

    Read the article

  • correct way to prevent SPF failures

    - by Sean Kimball
    I have a website on a hosted server whose mail users send mail through their ISP's SMTP server. I have set their SPF record to look like this: v=spf1 mx a:comcast.net ip4:216.70.103.0/24 ip4:216.70.101.0/24 ip4:76.96.53.0/24 -all The SMTP host is comcast.net, and 76.96.53.0/24 is the IP range they get assigned addresses from. ip4:216.70.103.0/24 and ip4:216.70.101.0/24 are the two possible SMTP ranges they could get IF they used their hosting account mail servers [Media Temple]. They are still getting SPF failures. Any idea why?
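
    One likely explanation, offered tentatively: receivers evaluate SPF against the IP of the server that actually hands them the mail, which in this setup is one of the ISP's outbound relays, not the customer's own 76.96.53.0/24 address, and a:comcast.net only authorizes the A record of comcast.net itself, not those relays. A hedged example of a record shape that usually behaves better, assuming the ISP publishes an SPF record of its own (verify with dig TXT comcast.net before copying) and soft-failing while testing:

        v=spf1 mx include:comcast.net ip4:216.70.103.0/24 ip4:216.70.101.0/24 ~all

    Once the failures stop, ~all can be tightened back to -all.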

    Read the article

  • How can I prompt user for additional permissions on page load?

    - by Rew
    Note: The application I'm building is a website type Facebook Application that uses Facebook Connect. I can prompt the user with a request for Extended Permissions using the following FBML code: <fb:prompt-permission perms="read_stream,publish_stream"> Grant permission for status updates </fb:prompt-permission> Taken from here. This creates a link on the page that must be clicked by the user in order to trigger the prompt to display. My question is: Is it possible to display this prompt automatically, when the page loads, without requiring the user to click on the link? Also, in reply to the answer below, I'd like to avoid displaying the link. Would prefer a neat way to do this, if that fails the least dirty method will do :-)
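
    If moving off the FBML tag is acceptable, one way this is often handled is with the Facebook JavaScript SDK: call FB.login with the wanted permissions as soon as the SDK has initialised, which opens the permission dialog without any link click. A hedged sketch; 'YOUR_APP_ID' is a placeholder, and on newer SDK versions the option is named 'scope' instead of 'perms'. Note that browsers may block the popup when FB.login is not triggered by a user click, which is one reason the link-based prompt exists.

        // Hedged sketch: prompt for extended permissions on page load via the JS SDK.
        window.fbAsyncInit = function () {
            FB.init({ appId: 'YOUR_APP_ID', cookie: true, xfbml: true });

            FB.getLoginStatus(function (response) {
                if (response.status === 'connected') {
                    // Already connected: ask for the extra permissions right away.
                    FB.login(function (loginResponse) {
                        // Inspect loginResponse here to see which permissions were granted.
                    }, { perms: 'read_stream,publish_stream' });
                }
            });
        };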

    Read the article

  • Which mediacenter appliance

    - by Guillaume
    I'm looking to buy a media center appliance, but I am a bit lost among all the offers on the market. Here are the features I am looking for:
    - at least a 500 GB HDD
    - capable of playing HD (1080p)
    - support for a good range of video and audio codecs
    - support for subtitles
    - a DVB-T input (coax) and tuner
    - composite video input
    - 5.1 (or 7.1) analog audio output
    - LAN (100M or Gigabit)
    - file server (SMB or NFS)
    - remote control
    - possibly digital output (S/PDIF, TOSLINK, or other)
    - no need for an integrated DVD player
    - no need for Wi-Fi
    The Ultio Pro looks nice (http://www.thinkgeek.com/electronics/home-entertainment/d3fe). It just lacks a DVB-T input, which is a show-stopper for me. Thanks for your suggestions.

    Read the article

  • What is the best memory supported by Asus Rampage Formula (1) motherboard

    - by James
    I'm rebuilding a PC from old components I have and I want to know what are the best ram sticks for the Asus Rampage Formula (The first motherboard in the Rampage Formula range) According to the Asus website (HERE) the maximum supported memory is 8GB but when I search through the memory compatibility list there's nothing above 2046mb and the motherboard only supports single or dual-channel memory. Can anyone point me in the direction of a more comprehensive list of compatible memory? Or does anyone know where I can ask about memory compatibility? Note: As far as I know, this model motherboard is no longer in production.

    Read the article

  • Computer crashes and won't start, LED indicator blinks slowly

    - by hexacyanide
    I have an MSI K8NGM2-FID motherboard coupled with an AMD Athlon 3700+ and an Antec TPII-550 power supply. The computer crashes at random times during operation, sometimes taking a while to crash and sometimes crashing right after Windows XP shows the desktop after boot. The CPU temperature is always in the safe range, usually 30-33 degrees Celsius, and swapping the RAM has not changed anything, so the crashes aren't caused by the RAM. What could be causing this? If the motherboard were fried, would the computer even boot at all? What causes inconsistent crashing like this? After the computer crashes, the LED power indicator blinks about once a second and the computer does not respond to the power button. This behavior continues until the computer's power supply is completely disconnected.

    Read the article

  • Setting up a vpn and IIS IP address restrictions

    - by carpat
    I'm trying to get a VPN set up with sites that allow internal access only. I have set up a VPN on a Windows server (a single VPS), and I can connect from a remote computer and get an IP assigned correctly (from 192.168.1.1-255). Next I configured IIS (running on the same machine) IP Address and Domain Restrictions to allow only the IP address range 192.168.1.0 with subnet mask 255.255.255.0. When I connect to the VPN with "Use Default Gateway on Remote Network" (so that requests must go through the VPN), I get a 403 from the internal sites. What did I miss?

    Read the article

  • Need a list of special character classes with their respective characters

    - by pravin
    I am working on a web application related to machine translation: it takes source text and translates it into a user-specified language. Currently it's in the unit-testing phase. I want to check whether my machine translation feature works correctly for all special characters. Because of the different test cases, I am stuck at one point where I need all the special characters grouped by classification, for example:
    - 1st: class name: Punctuation; characters: !?,"| etc.; test cases: segment1? segment2! segment3.
    - 2nd: class name: HTML entities; characters: all the characters which belong under this class; test cases: respective test cases
    - 3rd: class name: Extended ASCII; characters: all the characters which belong under this class; test cases: respective test cases
    Please provide this if anyone has any ideas or links, so that I can make the product solid. Thanks a lot.
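
    If generating the listing programmatically is acceptable, here is a small sketch (Python 3, chosen purely for illustration) that buckets characters into the kind of classes described above: Unicode punctuation, the named HTML entities, and the 128-255 range that is loosely called "extended ASCII".

        # Sketch: build "class name -> characters" listings for test-case generation.
        import unicodedata
        from html.entities import html5  # named HTML entities, e.g. 'eacute;' -> 'é'

        def punctuation_chars(limit=0x3000):
            """Code points below `limit` whose Unicode category starts with 'P'."""
            return [chr(cp) for cp in range(limit)
                    if unicodedata.category(chr(cp)).startswith('P')]

        def extended_ascii_chars():
            """Code points 128-255 (what is often loosely called 'extended ASCII')."""
            return [chr(cp) for cp in range(128, 256)]

        if __name__ == '__main__':
            classes = {
                'Punctuation': punctuation_chars(),
                'HTML entities': sorted(set(html5.values())),
                'Extended ASCII': extended_ascii_chars(),
            }
            for name, chars in classes.items():
                print(name, ':', ''.join(chars)[:200])  # truncated for display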

    Read the article

  • Overflow exception while performing parallel factorization using the .NET Task Parallel Library (TPL)

    - by Aviad P.
    Hello, I'm trying to write a not so smart factorization program and trying to do it in parallel using TPL. However, after about 15 minutes of running on a core 2 duo machine, I am getting an aggregate exception with an overflow exception inside it. All the entries in the stack trace are part of the .NET framework, the overflow does not come from my code. Any help would be appreciated in figuring out why this happens. Here's the commented code, hopefully it's simple enough to understand: class Program { static List<Tuple<BigInteger, int>> factors = new List<Tuple<BigInteger, int>>(); static void Main(string[] args) { BigInteger theNumber = BigInteger.Parse( "653872562986528347561038675107510176501827650178351386656875178" + "568165317809518359617865178659815012571026531984659218451608845" + "719856107834513527"); Stopwatch sw = new Stopwatch(); bool isComposite = false; sw.Start(); do { /* Print out the number we are currently working on. */ Console.WriteLine(theNumber); /* Find a factor, stop when at least one is found (using the Any operator). */ isComposite = Range(theNumber) .AsParallel() .Any(x => CheckAndStoreFactor(theNumber, x)); /* Of the factors found, take the one with the lowest base. */ var factor = factors.OrderBy(x => x.Item1).First(); Console.WriteLine(factor); /* Divide the number by the factor. */ theNumber = BigInteger.Divide( theNumber, BigInteger.Pow(factor.Item1, factor.Item2)); /* Clear the discovered factors cache, and keep looking. */ factors.Clear(); } while (isComposite); sw.Stop(); Console.WriteLine(isComposite + " " + sw.Elapsed); } static IEnumerable<BigInteger> Range(BigInteger squareOfTarget) { BigInteger two = BigInteger.Parse("2"); BigInteger element = BigInteger.Parse("3"); while (element * element < squareOfTarget) { yield return element; element = BigInteger.Add(element, two); } } static bool CheckAndStoreFactor(BigInteger candidate, BigInteger factor) { BigInteger remainder, dividend = candidate; int exponent = 0; do { dividend = BigInteger.DivRem(dividend, factor, out remainder); if (remainder.IsZero) { exponent++; } } while (remainder.IsZero); if (exponent > 0) { lock (factors) { factors.Add(Tuple.Create(factor, exponent)); } } return exponent > 0; } } Here's the exception thrown: Unhandled Exception: System.AggregateException: One or more errors occurred. --- > System.OverflowException: Arithmetic operation resulted in an overflow. 
at System.Linq.Parallel.PartitionedDataSource`1.ContiguousChunkLazyEnumerator.MoveNext(T& currentElement, Int32& currentKey) at System.Linq.Parallel.AnyAllSearchOperator`1.AnyAllSearchOperatorEnumerator`1.MoveNext(Boolean& currentElement, Int32& currentKey) at System.Linq.Parallel.StopAndGoSpoolingTask`2.SpoolingWork() at System.Linq.Parallel.SpoolingTaskBase.Work() at System.Linq.Parallel.QueryTask.BaseWork(Object unused) at System.Linq.Parallel.QueryTask.<.cctor>b__0(Object o) at System.Threading.Tasks.Task.InnerInvoke() at System.Threading.Tasks.Task.Execute() --- End of inner exception stack trace --- at System.Linq.Parallel.QueryTaskGroupState.QueryEnd(Boolean userInitiatedDispose) at System.Linq.Parallel.SpoolingTask.SpoolStopAndGo[TInputOutput,TIgnoreKey](QueryTaskGroupState groupState, PartitionedStream`2 partitions, SynchronousChannel`1[] channels, TaskScheduler taskScheduler) at System.Linq.Parallel.DefaultMergeHelper`2.System.Linq.Parallel.IMergeHelper<TInputOutput>.Execute() at System.Linq.Parallel.MergeExecutor`1.Execute[TKey](PartitionedStream`2 partitions, Boolean ignoreOutput, ParallelMergeOptions options, TaskScheduler taskScheduler, Boolean isOrdered, CancellationState cancellationState, Int32 queryId) at System.Linq.Parallel.PartitionedStreamMerger`1.Receive[TKey](PartitionedStream`2 partitionedStream) at System.Linq.Parallel.AnyAllSearchOperator`1.WrapPartitionedStream[TKey](PartitionedStream`2 inputStream, IPartitionedStreamRecipient`1 recipient, BooleanpreferStriping, QuerySettings settings) at System.Linq.Parallel.UnaryQueryOperator`2.UnaryQueryOperatorResults.ChildResultsRecipient.Receive[TKey](PartitionedStream`2 inputStream) at System.Linq.Parallel.ScanQueryOperator`1.ScanEnumerableQueryOperatorResults.GivePartitionedStream(IPartitionedStreamRecipient`1 recipient) at System.Linq.Parallel.UnaryQueryOperator`2.UnaryQueryOperatorResults.GivePartitionedStream(IPartitionedStreamRecipient`1 recipient) at System.Linq.Parallel.QueryOperator`1.GetOpenedEnumerator(Nullable`1 mergeOptions, Boolean suppressOrder, Boolean forEffect, QuerySettings querySettings) at System.Linq.Parallel.QueryOpeningEnumerator`1.OpenQuery() at System.Linq.Parallel.QueryOpeningEnumerator`1.MoveNext() at System.Linq.Parallel.AnyAllSearchOperator`1.Aggregate() at System.Linq.ParallelEnumerable.Any[TSource](ParallelQuery`1 source, Func`2 predicate) at PFact.Program.Main(String[] args) in d:\myprojects\PFact\PFact\Program.cs:line 34 Any help would be appreciated. Thanks!
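
    One hedged explanation that fits the stack trace: PLINQ's ContiguousChunkLazyEnumerator tracks each element's position in an Int32, and with a 170-digit target the candidate enumerable yields far more than int.MaxValue elements before Any ever succeeds, so the element index itself overflows. A sketch of one way around that, meant to replace the single Range(theNumber).AsParallel().Any(...) line inside the existing Main (everything else, including CheckAndStoreFactor, as posted): feed AsParallel() bounded batches so no single query enumerates more than a couple of billion items.

        // Hedged sketch: bounded batches keep PLINQ's 32-bit element index in range.
        const int batchSize = 50000000;          // candidates per PLINQ query
        BigInteger candidate = 3;
        isComposite = false;

        while (!isComposite && candidate * candidate < theNumber)
        {
            BigInteger batchStart = candidate;   // capture for the lambda
            isComposite = Enumerable.Range(0, batchSize)
                .AsParallel()
                .Any(i => CheckAndStoreFactor(theNumber, batchStart + 2 * (BigInteger)i));
            candidate += 2 * (BigInteger)batchSize;
        }

    The last batch may test a few candidates beyond the square root; that only costs wasted work, not correctness.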

    Read the article
