Search Results

Search found 19131 results on 766 pages for 'power options'.


  • SQLAuthority News – Technical Review of Learning at Koenig Solutions

    - by pinaldave
    Yesterday I finished my three-day, fast-track, in-person course, End to End SQL Server Business Intelligence, at Koenig Solutions. You can read my previous article over here regarding why I am learning SQL Server. Yesterday I blogged about my experience of arriving at the training center and my induction with the center. The Training Days: I had enrolled for three days of training, so my routine on each of the three days was very much the same. However, the content was different every day, as I was learning something new each day. Let me describe a few of the interesting details of my daily routine. A Single Student Batch: The best part of my training was that I was the only student in my batch. Koenig is known for smaller batches, and they often have single-student batches as well. I was delighted to know that I would have dedicated access to and attention from my trainer, as I would be the only student in the batch. In most of the labs I observed there were no more than 4 students at any time. Prakash and Pinal. 7:30 AM Breakfast Talk: All of us students gathered at 7:30 in the breakfast area. The best time of the day. I was the only Indian student in the group. The other students were from the USA, Canada, Nigeria, Bhutan, Tanzania, and a few other countries. I immediately became the source of information and the reference manual. Though the distance between Delhi and Bangalore is 2000+ km, I was considered a local guy. 8:30 AM Heading to Training Center: Every day, without fail, at 8:30 the van started from our accommodation to the training center. As mentioned in an earlier blog post, the ride is about 5 minutes, and we were able to reach the location before 8:45. This gave us some time to settle in before our class started at 9:00 AM. 9:00 AM Order Lunch Food: It may sound funny, as we had just had breakfast 30 minutes earlier, but the first thing everybody has to do as soon as the class starts is order lunch. There is an online training portal to order food for the day. Everybody has to place their order early in the day so the food arrives on time at lunch. Everybody can order whatever they want using the online ordering system. The options are plenty and everybody can order what they like. 9:05 AM Learning Starts: After deciding on lunch we started the learning. I was very fortunate to have a very experienced trainer - Prakash Chheatry. Though I had never met him before, I had heard a lot about Prakash. He is known as the topmost SQL Server trainer in India. His student list contains some of the very well-known SQL Server experts of the world and a few authors of “best seller” SQL Server books. Learning continued till 1:00 PM with one tea/coffee break in between. 1:00 PM Lunch: Lunch time is again fun time. All of us students get together in the afternoon and tell the stories of the world. Indeed the best part of the day besides learning new stuff. 4:55 PM Ready to Return: We stop at 4:55, as at precisely 5:00 PM the van stops by the institute to take us back to our accommodation. Trust me, it is always a seriously long day, but the amount of learning is the win of the day. 7:30 PM Dinner Time: After coming back to the accommodation I study till 7:30 and then rush for dinner. Dinner is world cuisine and the desserts are really delicious. Every day after dinner I have written a blog post and retired early, as the next day is always going to be busier than the present one. What Did I Learn: As I mentioned earlier, I know SQL Server fairly well.
    I had expressed the same in my conversation as well. This is the reason I was assigned a fairly senior trainer, and we learned everything quite quickly. As I already knew quite a few things, we went pretty fast through many topics. There were a few things I wanted to learn in detail as well as practice in the labs. We slowed down where we wanted to and rushed through the concepts where I was very comfortable. Here is the list of the things we covered in the action-packed three days: Introduction to Business Intelligence (Intro), SQL Server Analysis Services (Theory and Lab), SQL Server Integration Services (Theory and Lab), SQL Server Reporting Services (Theory and Lab), SQL Server PowerPivot (Lab), UDM (Theory), SharePoint Concepts (Theory), Power View (Demo), Business Intelligence and Security (Discussion). Well, I was delighted that I was able to refresh lots of concepts during these three days. Thanks to my trainer and my friend who helped me have a good learning experience. I believe all the learning will help me in my growth and future career. With this I end this experience. I am planning to have another, online, learning experience later this month. I will blog about that experience as I begin it. Reference: Pinal Dave (http://blog.sqlauthority.com) Filed under: PostADay, SQL, SQL Authority, SQL Query, SQL Server, SQL Tips and Tricks, SQL Training, T SQL, Technology

    Read the article

  • Synchronize model in MySQL Workbench

    - by Álvaro G. Vicario
    After reading the documentation for MySQL Workbench I got the impression that it's possible to alter a database on the server (e.g. add a new column) and later incorporate the DDL changes into your EER diagram. At least, it has a Synchronize Model option in the Database menu. I found it a nice feature because I could use a graphical modelling tool without becoming its prisoner. In practice, when I run this tool I'm offered these options:

        Model               Update   Source
        ================    ======   ======
        my_database_name    --> !    N/A
        my_table_name       --> !    N/A
        N/A                 --> !    my_database_name
        N/A                 --> !    my_table_name

    I can't really understand it, but leaving it as is I basically get:

        DROP SCHEMA my_database_name
        CREATE SCHEMA my_database_name
        CREATE TABLE my_table_name

    This is a dump of the model that overwrites all remote changes in my_table_name. Am I misunderstanding the feature?

    Read the article

  • two RewriteRules with www redirect

    - by Eric Di Bari
    I have a multiple language website, that uses subdirectories from the root ('/en' for english and '/es' for spanish) for each specific language. Each redirect appends a get variable to the URL, and hides it using a 'P' flag for proxy. My current htaccess file for the spanish subfolder is: Options +FollowSymlinks RewriteEngine on RewriteOptions MaxRedirects=10 RewriteBase / RewriteRule ^(.*)\.html$ $1.php RewriteRule ^(.*)$ http://www.domain.com/$1?l=es [P,R=301,L] The problem is that I also want to append the 'www' to the domain if it was left off. The proxy redirect does not show the 'www.' Is it possible to place a rewriterule before that final one that will append the www, and then still process the final one?
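    A minimal sketch of one possible ordering, assuming the same placeholder domain and that the rules stay in the /es subfolder: the bare-domain check runs first, so only requests that already carry www ever reach the proxy rule. The /es/ prefix in the redirect target is an assumption about how the subfolder is mapped.

        Options +FollowSymlinks
        RewriteEngine on
        RewriteOptions MaxRedirects=10
        RewriteBase /

        # Assumption: append www before anything else is rewritten
        RewriteCond %{HTTP_HOST} !^www\. [NC]
        RewriteRule ^(.*)$ http://www.domain.com/es/$1 [R=301,L]

        RewriteRule ^(.*)\.html$ $1.php
        RewriteRule ^(.*)$ http://www.domain.com/$1?l=es [P,R=301,L]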

    Read the article

  • Help! cant login to my forum after .htaccess changes

    - by MrRioku
    I'm running a phpbb forum and I installed an SEO friendly URL Mod. After I installed it I wasn't able to login as the admin nor any other users... Well I've been messing around with this problem for a bit and I did find out that it was the .htaccess file that was causing me not to be able to login. I found out if I comment the canonical part out in the .htaccess file, my website allows me to login and see the working mod just fine but I get "http://phone7forum.com/" instead of what I want which is "http://www.phone7forum.com/ : enter code here # HERE IS A GOOD PLACE TO FORCE CANONICAL DOMAIN # RewriteCond %{HTTP_HOST} !^www\.phone7forum\.com$ [NC] # RewriteRule ^(.*)$ http://www.phone7forum.com/$1 [QSA,L,R=301] enter code here I definitely want the "http://www." since its the best method and holds the best weight.. But when I un-comment it, it will not allow me to login as the admin nor any user... Is there anything I can do to fix this matter??? I'd like to add, before I found this website I installed the Mod "Canonical URL" found here: http://www.phpbb.com/customise/db/mod/canonical_url/ Possibly this might be part of my issue? Maybe I should go undo everything that Mod told me to do and then try un-commenting the canonical section in the .htaccess file? Here is my complete .htaccess file as of now that produces the "http://phone7forum.com/" and allows me to login: [code] # Lines That should already be in your .htacess <Files "config.php"> Order Allow,Deny Deny from All </Files> <Files "common.php"> Order Allow,Deny Deny from All </Files> # You may need to un-comment the following lines # Options +FollowSymlinks # To make sure that rewritten dir or file (/|.html) will not load dir.php in case it exist # Options -MultiViews # REMEBER YOU ONLY NEED TO STARD MOD REWRITE ONCE RewriteEngine On # Uncomment the statement below if you want to make use of # HTTP authentication and it does not already work. # This could be required if you are for example using PHP via Apache CGI. # RewriteRule .* - [E=HTTP_AUTHORIZATION:%{HTTP:Authorization},L] # REWRITE BASE RewriteBase / # HERE IS A GOOD PLACE TO FORCE CANONICAL DOMAIN # RewriteCond %{HTTP_HOST} !^www\.phone7forum\.com$ [NC] # RewriteRule ^(.*)$ http://www.phone7forum.com/$1 [QSA,L,R=301] # DO NOT GO FURTHER IF THE REQUESTED FILE / DIR DOES EXISTS RewriteCond %{REQUEST_FILENAME} -f RewriteCond %{REQUEST_FILENAME} -d RewriteRule . - [L] ##################################################### # PHPBB SEO REWRITE RULES ALL MODES ##################################################### # AUTHOR : dcz www.phpbb-seo.com # STARTED : 01/2006 ################################# # FORUMS PAGES ############### # FORUM INDEX REWRITERULE WOULD STAND HERE IF USED. 
"forum" REQUIRES TO BE SET AS FORUM INDEX # RewriteRule ^forum\.html$ index.php [QSA,L,NC] # FORUM ALL MODES RewriteRule ^(forum|[a-z0-9_-]*-f)([0-9]+)/?(page([0-9]+)\.html)?$ viewforum.php?f=$2&start=$4 [QSA,L,NC] # TOPIC WITH VIRTUAL FOLDER ALL MODES RewriteRule ^(forum|[a-z0-9_-]*-f)([0-9]+)/(topic|[a-z0-9_-]*-t)([0-9]+)(-([0-9]+))?\.html$ viewtopic.php?f=$2&t=$4&start=$6 [QSA,L,NC] # TOPIC WITHOUT FORUM ID & DELIM ALL MODES RewriteRule ^([a-z0-9_-]*)/?(topic|[a-z0-9_-]*-t)([0-9]+)(-([0-9]+))?\.html$ viewtopic.php?forum_uri=$1&t=$3&start=$5 [QSA,L,NC] # PHPBB FILES ALL MODES RewriteRule ^resources/[a-z0-9_-]+/(thumb/)?([0-9]+)$ download/file.php?id=$2&t=$1 [QSA,L,NC] # PROFILES ALL MODES WITH ID RewriteRule ^(member|[a-z0-9_-]*-u)([0-9]+)\.html$ memberlist.php?mode=viewprofile&u=$2 [QSA,L,NC] # USER MESSAGES ALL MODES WITH ID RewriteRule ^(member|[a-z0-9_-]*-u)([0-9]+)-(topics|posts)(-([0-9]+))?\.html$ search.php?author_id=$2&sr=$3&start=$5 [QSA,L,NC] # GROUPS ALL MODES RewriteRule ^(group|[a-z0-9_-]*-g)([0-9]+)(-([0-9]+))?\.html$ memberlist.php?mode=group&g=$2&start=$4 [QSA,L,NC] # POST RewriteRule ^post([0-9]+)\.html$ viewtopic.php?p=$1 [QSA,L,NC] # ACTIVE TOPICS RewriteRule ^active-topics(-([0-9]+))?\.html$ search.php?search_id=active_topics&start=$2&sr=topics [QSA,L,NC] # UNANSWERED TOPICS RewriteRule ^unanswered(-([0-9]+))?\.html$ search.php?search_id=unanswered&start=$2&sr=topics [QSA,L,NC] # NEW POSTS RewriteRule ^newposts(-([0-9]+))?\.html$ search.php?search_id=newposts&start=$2&sr=topics [QSA,L,NC] # UNREAD POSTS RewriteRule ^unreadposts(-([0-9]+))?\.html$ search.php?search_id=unreadposts&start=$2 [QSA,L,NC] # THE TEAM RewriteRule ^the-team\.html$ memberlist.php?mode=leaders [QSA,L,NC] # HERE IS A GOOD PLACE TO ADD OTHER PHPBB RELATED REWRITERULES # FORUM WITHOUT ID & DELIM ALL MODES # THESE THREE LINES MUST BE LOCATED AT THE END OF YOUR HTACCESS TO WORK PROPERLY RewriteCond %{REQUEST_FILENAME} !-f RewriteCond %{REQUEST_FILENAME} !-d RewriteRule ^([a-z0-9_-]+)/?(page([0-9]+)\.html)?$ viewforum.php?forum_uri=$1&start=$3 [QSA,L,NC] # FIX RELATIVE PATHS : FILES RewriteRule ^.+/(style\.php|ucp\.php|mcp\.php|faq\.php|download/file.php)$ $1 [QSA,L,NC,R=301] # FIX RELATIVE PATHS : IMAGES RewriteRule ^.+/(styles/.*|images/.*)/$ $1 [QSA,L,NC,R=301] # END PHPBB PAGES ##################################################### [/code] Does anyone have any clues as to what I need to do? Any help will be greatly appreciated! Thanks, Justin

    Read the article

  • Data Mining Resources

    - by Dejan Sarka
    There are many different types of analyses, each one with its own pros and cons. Relational reports have a predefined structure, and end users cannot change it. They are simple to use for end users. Reports can use real-time data and snapshots of data to show the state of a report at specific points in time. One of the drawbacks is that report authoring is limited to IT pros and advanced users. Any kind of dynamic restructuring is very limited. If real-time data is used for a report, the report has a negative impact on the performance of the source system. Processing of the reports might be slow because the data comes from relational database management systems, which are not optimized for reporting only. If you create a semantic model of your data, your end users can create ad-hoc report structures. However, the development is more complex because a developer is needed to create these semantic models. For OLAP, you typically use specialized database management systems. You get lightning speed of analyses. End users can use rich and thin clients to interactively change the structure of the report. Typically, they do it graphically. However, the development of an OLAP system is many times quite complex. It involves the preparation and maintenance of an enterprise data warehouse and OLAP cubes. In order to exploit the possibility of real-time restructuring of reports, the users must be both active and educated. The data is usually stale, as it is loaded into data warehouses and OLAP cubes with a scheduled process. With data mining, a structure is not selected in advance; it searches for the structure. As a result, data mining can give you the most valuable results because you can discover patterns you did not expect. A data mining model structure is limited only by the attributes that you use to train the model. One of the drawbacks is that a lot of knowledge is needed for a successful data mining project. End users have to understand the results. Subject matter experts and IT professionals need to understand business problem thoroughly. The development might be sometimes even more complex than the development of OLAP cubes. Each type of analysis has its own place in an enterprise system. SQL Server has tools for all kinds of analyses. However, data mining is the most advanced way of analyzing the data; this is the “I” in BI. In order to get the most out of it, you need to learn quite a lot. In this blog post, I am gathering together resources for learning, including forthcoming events. Books Multiple authors: SQL Server MVP Deep Dives – I wrote an introductory data mining chapter there. Erik Veerman, Teo Lachev and Dejan Sarka: MCTS Self-Paced Training Kit (Exam 70-448): Microsoft SQL Server 2008 - Business Intelligence Development and Maintenance – you can find a good overview of a complete BI solution, including data mining, in this book. Jamie MacLennan, ZhaoHui Tang, and Bogdan Crivat: Data Mining with Microsoft SQL Server 2008 – can’t miss this book if you want to mine your data with SQL Server tools. Michael Berry, Gordon Linoff: Mastering Data Mining: The Art and Science of Customer Relationship Management – data mining from both, business and technical perspective. Dorian Pyle: Data Preparation for Data Mining – an in-depth book about data preparation. Thomas and Ronald Wonnacott: Introductory Statistics – if you thought that you could get away without statistics, then you are not serious about data mining. 
Jiawei Han and Micheline Kamber: Data Mining Concepts and Techniques – in-depth explanation of the most popular data mining algorithms. Michael Berry and Gordon Linoff: Data Mining Techniques – another book that explains data mining algorithms, more fro a business perspective. Paolo Guidici: Applied Data Mining – very mathematical book, only if you enjoy statistics and mathematics in general. Forthcoming presentations I am presenting two data mining related sessions during the PASS Summit in Charlotte, NC: Wednesday, October 16th, 2013 - Fraud Detection: Notes from the Field – I am showing how to use data mining for a specific business problem. The presentation is based on real-life projects. Friday, October 18th: Excel 2013 Advanced Analytics – I am focusing on Excel Data Mining Add-ins, and how to use them together with Power Pivot and other add-ins. This is the most you can get out of Excel. Sinergija 2013, Belgrade, Serbia Tuesday, October 22nd: Excel 2013 Analytics to the Max – another presentation focusing on the most advanced analytics you can get in Excel. SQL Rally Amsterdam, Netherlands Thursday, November 7th: Advanced Analytics in Excel 2013 – and again I am presenting about data mining in Excel. Why three different titles for the same presentation? I don’t know, I guess I forgot the name I proposed every time right after I sent the proposal. Courses Data Mining with SQL Server 2012 – I wrote a 3-day course for SolidQ. If you are interested in this course, which I could also deliver in a shorter seminar way, you can contact your closes SolidQ subsidiary, or, of course, me directly on addresses [email protected] or [email protected]. This course could also complement the existing courseware portfolio of training providers, which are welcome to contact me as well. OK, now you know: no more excuses, start learning data mining, get the most out of your data

    Read the article

  • Help me understand JavaScript events

    - by Anil Namde
    I have a table full of cells and I would like to know which cell the mouse is over. For this I have attached events to all the cells and then I am finding the elements. But I guess there could be a better option, right? Is it possible to attach only a single event handler at the top and still be able to catch all the information, like which cell the user is currently on, etc.? Something like below: <table onMouseOver="monitorFunction(event)" >...</table>
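    Yes: mouse events bubble up from the cell to the table, so a single handler on the table can inspect the event's target to find the cell. A minimal sketch along the lines of the markup above (the handler name follows the example):

        <table onMouseOver="monitorFunction(event)">...</table>

        <script type="text/javascript">
        function monitorFunction(e) {
            e = e || window.event;                 // older IE exposes the event globally
            var cell = e.target || e.srcElement;   // the element the mouse is actually over
            if (cell && cell.tagName === 'TD') {
                // rowIndex / cellIndex give the cell's position in the table
                alert('row ' + cell.parentNode.rowIndex + ', cell ' + cell.cellIndex);
            }
        }
        </script>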

    Read the article

  • Is it possible to get the client process ID of an application that runs on SQL server?

    - by Andrea.Ko
    Hi all, for my VFP application I have a program to check who is currently accessing the server (by using sp_who2), and another program to check who is currently locking which table. But I wish to know which options my users are accessing at the moment. I am thinking I can write a stored procedure to get the currently connected process ID for a specific client and insert it into a table (ActLog) in SQL, along with the program name passed in, when the user loads the program, and delete that particular record when the user unloads the program. Then from ActLog I can know who is currently accessing which program. At the moment, I wish to know if I am able to get the client process ID? rgds/Andrea
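    Yes: @@SPID returns the server process ID of the current connection (the same number sp_who2 shows), and HOST_NAME() / APP_NAME() identify the client. A rough sketch of the two procedures the VFP program could call at load and unload; the ActLog columns and procedure names are assumptions:

        CREATE PROCEDURE dbo.LogProgramLoad
            @ProgramName VARCHAR(100)
        AS
            -- @@SPID = this connection's server process ID
            INSERT INTO dbo.ActLog (Spid, HostName, AppName, ProgramName, LoadedAt)
            VALUES (@@SPID, HOST_NAME(), APP_NAME(), @ProgramName, GETDATE());
        GO

        CREATE PROCEDURE dbo.LogProgramUnload
            @ProgramName VARCHAR(100)
        AS
            DELETE FROM dbo.ActLog
            WHERE Spid = @@SPID AND ProgramName = @ProgramName;
        GO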

    Read the article

  • Jquery - $(this) in in nested loops

    - by Smickie
    Hi, I can't figure out how to do something in jQuery. Let's say I have a form with many select drop-downs and do this... $('#a_form select').each(function(index) { }); Inside this loop I want to loop over each of the options, but I can't figure out how to do this. Is it something like this? $('#a_form select').each(function(index) { $(this + 'option').each(function(index) { //do things }); }); I can't quite get it to work; any advice? Cheers.
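    The concatenation $(this + 'option') does not produce a valid selector; scoping the lookup to the current select with .find() (or the context form $('option', this)) is the usual way. A small sketch:

        $('#a_form select').each(function(index) {
            // $(this) is the current <select>; find() limits the search to its children
            $(this).find('option').each(function(index) {
                // do things, e.g.
                console.log($(this).val());
            });
            // equivalent: $('option', this).each(...)
        });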

    Read the article

  • JSON + 3DES encryption doesn't work

    - by oscurodrago
    i'm trying to pass some JSON data encrypted to my app but seem when i decrypt it my script add some 00 to hexe code making it impossible to be serialized i've tried to pass data uncrypted and crypted and the only difference i found is 00 at the end that is how i read JSON if isn't Encrypted NSLog(@"A) %@", response ); NSMutableArray *json = [NSJSONSerialization JSONObjectWithData:response options:kNilOptions error:&error]; NSLog(@"B) %@", error); that is NSLog Output 2012-03-28 00:43:58.399 VGCollect[5583:f803] A) <5b7b2274 69746c65 223a2252 696c6173 63696174 61206c61 20707269 6d612045 7370616e 73696f6e 65206469 20526167 6e61726f 6b204f64 79737365 79222c22 63617465 676f7279 223a225b 20505356 69746120 5d222c22 696d6755 726c223a 22494d41 4745535c 2f323031 325c2f30 335c2f30 30303038 3536345f 31353078 38302e6a 7067227d 2c7b2274 69746c65 223a2250 68616e74 61737920 53746172 204f6e6c 696e6520 3220616e 63686520 73752069 4f532065 20416e64 726f6964 20222c22 63617465 676f7279 223a225b 20495048 4f4e452c 2050432c 20505356 69746120 5d222c22 696d6755 726c223a 22494d41 4745535c 2f323031 325c2f30 335c2f30 30303038 3437355f 31353078 38302e6a 7067227d 2c7b2274 69746c65 223a2250 72696d69 20536372 65656e73 686f7420 696e2061 6c747261 20726973 6f6c757a 696f6e65 20706572 202e6861 636b5c2f 5c2f7665 72737573 222c2263 61746567 6f727922 3a225b20 50533320 5d222c22 696d6755 726c223a 22494d41 4745535c 2f323031 325c2f30 335c2f30 30303038 3437325f 31353078 38302e6a 7067227d 2c7b2274 69746c65 223a2269 6c206e75 6f766f20 444c4320 64692046 696e616c 2046616e 74617379 20584949 492d3220 636f6e66 65726d61 746f2061 6e636865 20696e20 416d6572 69636122 2c226361 7465676f 7279223a 225b2050 53332c20 58333630 205d222c 22696d67 55726c22 3a22494d 41474553 5c2f3230 31325c2f 30335c2f 30303030 38343439 5f313530 7838302e 6a706722 7d2c7b22 7469746c 65223a22 52656769 73747261 746f2069 6c206d61 72636869 6f205461 6c657320 6f662058 696c6c69 6120696e 20457572 6f706122 2c226361 7465676f 7279223a 225b2050 5333205d 222c2269 6d675572 6c223a22 494d4147 45535c2f 32303132 5c2f3033 5c2f3030 30303834 34385f31 35307838 302e6a70 67227d2c 7b227469 746c6522 3a225376 656c6174 69202e68 61636b5c 2f5c2f56 65727375 732e2065 202e6861 636b5c2f 5c2f5365 6b616920 6e6f204d 756b6f75 206e6922 2c226361 7465676f 7279223a 225b2050 5333205d 222c2269 6d675572 6c223a22 494d4147 45535c2f 32303132 5c2f3033 5c2f3030 30303834 33345f31 35307838 302e6a70 67227d2c 7b227469 746c6522 3a22476f 6e205061 6b752050 616b7520 50616b75 2050616b 75204164 76656e74 75726520 756e206e 756f766f 2067696f 636f2064 69204e61 6d636f20 42616e64 6169222c 22636174 65676f72 79223a22 5b203344 53205d22 2c22696d 6755726c 223a2249 4d414745 535c2f32 3031325c 2f30335c 2f303030 30383432 385f3135 30783830 2e6a7067 227d2c7b 22746974 6c65223a 224e756f 76652069 6d6d6167 696e6920 6520696e 666f2064 69204669 72652045 6d626c65 6d204177 616b656e 696e6722 2c226361 7465676f 7279223a 225b2033 4453205d 222c2269 6d675572 6c223a22 494d4147 45535c2f 32303132 5c2f3033 5c2f3030 30303834 31365f31 35307838 302e6a70 67227d2c 7b227469 746c6522 3a225072 696d6520 696d6d61 67696e69 20646569 206e756f 76692044 4c432064 69204669 6e616c20 46616e74 61737920 58494949 2d32222c 22636174 65676f72 79223a22 5b205053 332c2058 33363020 5d222c22 696d6755 726c223a 22494d41 4745535c 2f323031 325c2f30 335c2f30 30303038 3431325f 31353078 38302e6a 7067227d 2c7b2274 69746c65 223a2249 6e206172 7269766f 20717565 73746f20 41757475 6e6e6f20 45706963 204d6963 6b657920 323a2054 68652050 6f776572 206f6620 54776f22 2c226361 7465676f 7279223a 225b2050 53332c20 
5769692c 20583336 30205d22 2c22696d 6755726c 223a2249 4d414745 535c2f32 3031325c 2f30335c 2f303030 30383430 385f3135 30783830 2e6a7067 227d5d> 2012-03-28 00:43:58.410 VGCollect[5583:f803] B) (null) instead that is how i serialize my crypted JSON NSData *decryptBase64 = [GTMBase64 decodeData:response]; NSData *decrypt3DES = [Crypt TripleDES:decryptBase64 encryptOrDecrypt:kCCDecrypt key:@"2b9534b45611cbb2436e625d"]; NSLog(@"A) %@", decrypt3DES ); NSMutableArray *json = [NSJSONSerialization JSONObjectWithData:decrypt3DES options:kNilOptions error:&error]; NSLog(@"B) %@", error); and that is the NSLog output 2012-03-28 00:41:51.525 VGCollect[5529:f803] A) <5b7b2274 69746c65 223a2252 696c6173 63696174 61206c61 20707269 6d612045 7370616e 73696f6e 65206469 20526167 6e61726f 6b204f64 79737365 79222c22 63617465 676f7279 223a225b 20505356 69746120 5d222c22 696d6755 726c223a 22494d41 4745535c 2f323031 325c2f30 335c2f30 30303038 3536345f 31353078 38302e6a 7067227d 2c7b2274 69746c65 223a2250 68616e74 61737920 53746172 204f6e6c 696e6520 3220616e 63686520 73752069 4f532065 20416e64 726f6964 20222c22 63617465 676f7279 223a225b 20495048 4f4e452c 2050432c 20505356 69746120 5d222c22 696d6755 726c223a 22494d41 4745535c 2f323031 325c2f30 335c2f30 30303038 3437355f 31353078 38302e6a 7067227d 2c7b2274 69746c65 223a2250 72696d69 20536372 65656e73 686f7420 696e2061 6c747261 20726973 6f6c757a 696f6e65 20706572 202e6861 636b5c2f 5c2f7665 72737573 222c2263 61746567 6f727922 3a225b20 50533320 5d222c22 696d6755 726c223a 22494d41 4745535c 2f323031 325c2f30 335c2f30 30303038 3437325f 31353078 38302e6a 7067227d 2c7b2274 69746c65 223a2269 6c206e75 6f766f20 444c4320 64692046 696e616c 2046616e 74617379 20584949 492d3220 636f6e66 65726d61 746f2061 6e636865 20696e20 416d6572 69636122 2c226361 7465676f 7279223a 225b2050 53332c20 58333630 205d222c 22696d67 55726c22 3a22494d 41474553 5c2f3230 31325c2f 30335c2f 30303030 38343439 5f313530 7838302e 6a706722 7d2c7b22 7469746c 65223a22 52656769 73747261 746f2069 6c206d61 72636869 6f205461 6c657320 6f662058 696c6c69 6120696e 20457572 6f706122 2c226361 7465676f 7279223a 225b2050 5333205d 222c2269 6d675572 6c223a22 494d4147 45535c2f 32303132 5c2f3033 5c2f3030 30303834 34385f31 35307838 302e6a70 67227d2c 7b227469 746c6522 3a225376 656c6174 69202e68 61636b5c 2f5c2f56 65727375 732e2065 202e6861 636b5c2f 5c2f5365 6b616920 6e6f204d 756b6f75 206e6922 2c226361 7465676f 7279223a 225b2050 5333205d 222c2269 6d675572 6c223a22 494d4147 45535c2f 32303132 5c2f3033 5c2f3030 30303834 33345f31 35307838 302e6a70 67227d2c 7b227469 746c6522 3a22476f 6e205061 6b752050 616b7520 50616b75 2050616b 75204164 76656e74 75726520 756e206e 756f766f 2067696f 636f2064 69204e61 6d636f20 42616e64 6169222c 22636174 65676f72 79223a22 5b203344 53205d22 2c22696d 6755726c 223a2249 4d414745 535c2f32 3031325c 2f30335c 2f303030 30383432 385f3135 30783830 2e6a7067 227d2c7b 22746974 6c65223a 224e756f 76652069 6d6d6167 696e6920 6520696e 666f2064 69204669 72652045 6d626c65 6d204177 616b656e 696e6722 2c226361 7465676f 7279223a 225b2033 4453205d 222c2269 6d675572 6c223a22 494d4147 45535c2f 32303132 5c2f3033 5c2f3030 30303834 31365f31 35307838 302e6a70 67227d2c 7b227469 746c6522 3a225072 696d6520 696d6d61 67696e69 20646569 206e756f 76692044 4c432064 69204669 6e616c20 46616e74 61737920 58494949 2d32222c 22636174 65676f72 79223a22 5b205053 332c2058 33363020 5d222c22 696d6755 726c223a 22494d41 4745535c 2f323031 325c2f30 335c2f30 30303038 3431325f 31353078 38302e6a 7067227d 2c7b2274 69746c65 223a2249 6e206172 7269766f 20717565 73746f20 
41757475 6e6e6f20 45706963 204d6963 6b657920 323a2054 68652050 6f776572 206f6620 54776f22 2c226361 7465676f 7279223a 225b2050 53332c20 5769692c 20583336 30205d22 2c22696d 6755726c 223a2249 4d414745 535c2f32 3031325c 2f30335c 2f303030 30383430 385f3135 30783830 2e6a7067 227d5d00> 2012-03-28 00:41:51.530 VGCollect[5529:f803] B) Error Domain=NSCocoaErrorDomain Code=3840 "The operation couldn’t be completed. (Cocoa error 3840.)" (Garbage at end.) UserInfo=0x6ccdd40 {NSDebugDescription=Garbage at end.} as u can see the only difference between 2 NSlogs is 00 at the end of second hexcode and the error on serialization of it do u know how fix that ? idk if is needed, that is my script for encrypt data +(NSData*)TripleDES:(NSData*)plainData encryptOrDecrypt:(CCOperation)encryptOrDecrypt key:(NSString*)key { const void *vplainText; size_t plainTextBufferSize; plainTextBufferSize = [plainData length]; vplainText = (const void *)[plainData bytes]; CCCryptorStatus ccStatus; uint8_t *bufferPtr = NULL; size_t bufferPtrSize = 0; size_t movedBytes = 0; // uint8_t ivkCCBlockSize3DES; bufferPtrSize = (plainTextBufferSize + kCCBlockSize3DES) & ~(kCCBlockSize3DES - 1); bufferPtr = malloc( bufferPtrSize * sizeof(uint8_t)); memset((void *)bufferPtr, 0x0, bufferPtrSize); // memset((void *) iv, 0x0, (size_t) sizeof(iv)); // NSString *key = @"123456789012345678901234"; NSString *initVec = @"init Vec"; const void *vkey = (const void *) [key UTF8String]; const void *vinitVec = (const void *) [initVec UTF8String]; ccStatus = CCCrypt(encryptOrDecrypt, kCCAlgorithm3DES, (kCCOptionPKCS7Padding | kCCOptionECBMode) , vkey, //"123456789012345678901234", //key kCCKeySize3DES, vinitVec, //"init Vec", //iv, vplainText, //"Your Name", //plainText, plainTextBufferSize, (void *)bufferPtr, bufferPtrSize, &movedBytes); NSData *result = [NSData dataWithBytes:(const void *)bufferPtr length:(NSUInteger)movedBytes]; return result; }
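    The stray 0x00 most likely comes from the block-cipher padding on the server side rather than from NSJSONSerialization itself. One way to work around it, assuming the plaintext never legitimately ends in a zero byte, is to trim trailing NULs before parsing; the variable names below match the code above:

        NSData *decryptBase64 = [GTMBase64 decodeData:response];
        NSData *decrypt3DES = [Crypt TripleDES:decryptBase64
                              encryptOrDecrypt:kCCDecrypt
                                           key:@"2b9534b45611cbb2436e625d"];

        // Strip trailing zero bytes left over from the cipher padding.
        const uint8_t *bytes = [decrypt3DES bytes];
        NSUInteger length = [decrypt3DES length];
        while (length > 0 && bytes[length - 1] == 0x00) {
            length--;
        }
        NSData *trimmed = [decrypt3DES subdataWithRange:NSMakeRange(0, length)];

        NSError *error = nil;
        NSMutableArray *json = [NSJSONSerialization JSONObjectWithData:trimmed
                                                               options:kNilOptions
                                                                 error:&error];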

    Read the article

  • FullCalendar with jQuery and Google Calendar

    - by Marv
    Hi, I have a problem with FullCalendar and Google Calendar. My FullCalendar does not show the Google entries. Need help, please. :) <script type="text/javascript">// <![CDATA[ $(document).ready(function() { var date = new Date(); var d = date.getDate(); var m = date.getMonth(); var y = date.getFullYear(); $('#calendar').fullCalendar({ header: { left: 'prev,next today', center: 'title', right: 'month,basicWeek,basicDay' }, }); $('#calendar').fullCalendar({ events: $.fullCalendar.gcalFeed( "http://www.google.com/calendar/feeds/marvin.spies%40gmx.de/public/basic/", { // put your options here className: 'gcal-event', editable: true, currentTimezone: 'Europe/Berlin' } ) }); }); // ]]
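    A plausible cause is that .fullCalendar() is called twice on the same element, and the second call (the one carrying the Google feed) is ignored because the calendar is already initialised. A sketch that merges both calls into one, keeping the same feed URL and options; it also assumes the gcal extension script (gcal.js) is included so that $.fullCalendar.gcalFeed exists:

        $(document).ready(function() {
            $('#calendar').fullCalendar({
                header: {
                    left: 'prev,next today',
                    center: 'title',
                    right: 'month,basicWeek,basicDay'
                },
                events: $.fullCalendar.gcalFeed(
                    "http://www.google.com/calendar/feeds/marvin.spies%40gmx.de/public/basic/",
                    {
                        className: 'gcal-event',
                        editable: true,
                        currentTimezone: 'Europe/Berlin'
                    }
                )
            });
        });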

    Read the article

  • How to create a Word document from a Silverlight 4 application

    - by George Durzi
    I'm looking for some options to programmatically create a Word document from within a Silverlight 4 application. I found two approaches which seemed promising at first but don't look like they will work. OpenXML SDK The OpenXML SDK isn't available for Silverlight at this time. Word Automation via COM Interop dynamic wordApplication = AutomationFactory.CreateObject("Word.Application"); Apparently this requires that the Silverlight 4 application be granted permission to run with elevated privileges, which is only available for out-of-browser applications (which ours isn't) My other thought is to hand off the request to a back-end service which doesn't have these limitations. Wanted to check for any ideas before going down that path. Thank you!

    Read the article

  • Winforms ComboBox autocomplete search multiple parts of string

    - by studiothat
    Very similar question to this one... http://stackoverflow.com/questions/522521/autocomplete-for-combobox-in-wpf-anywhere-in-text-not-just-beginning but my issue is for Windows Forms rather than WPF. I have a WinForms data-bound ComboBox working great, with the autocomplete list coming from the data items in the ComboBox. Of course the client wants it to work "better", and that means they want the autocomplete to search and show suggestions for any matching (Contains()) string, not just the starting string (StartsWith()). I know it's probably not just a property that can be set on the ComboBox, but can anyone point me in the right direction?
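    Right, the built-in AutoCompleteMode only matches prefixes, so "contains" behaviour usually means rebuilding the drop-down list yourself on each keystroke. A rough sketch of the idea; it assumes the items can be repopulated manually (a data-bound ComboBox may not allow this directly), and that System.Linq and System.Collections.Generic are imported:

        private List<string> allItems;   // full, unfiltered list captured once at load

        private void comboBox1_TextUpdate(object sender, EventArgs e)
        {
            string text = comboBox1.Text;
            string[] matches = allItems
                .Where(i => i.IndexOf(text, StringComparison.OrdinalIgnoreCase) >= 0)
                .ToArray();

            comboBox1.Items.Clear();
            comboBox1.Items.AddRange(matches);

            comboBox1.DroppedDown = true;            // keep the list open while typing
            comboBox1.Text = text;                   // Items.Clear() wipes the typed text
            comboBox1.SelectionStart = text.Length;  // restore the caret position
        }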

    Read the article

  • CACLS Confusion

    - by codeulike
    During my NSIS setup script for a WinForms app, I use the following CACLS command to give the Users group full rights to a subfolder: Exec 'CACLS "$INSTDIR\SubFolder" /E /T /C /G "Users":F' So in effect the CACLS command executed is something like: CACLS "c:\Program Files\MyApp\SubFolder" /E /T /C /G "Users":F When I then look at the folder permissions in Windows Explorer (right click on the folder and choose Properties, go to the Security tab), the correct permissions are there but they are uneditable. Furthermore, clicking the Advanced button for the 'Advanced Security Settings' shows that SubFolder is inheriting the "Users" group permissions from a 'Parent Object', but what is that Parent Object, because it's not the folder above. Why are the permissions added by CACLS uneditable, and why are they inherited from a nonexistent parent object? I'm thinking I may have set the options on CACLS wrong. I'm on Windows XP.

    Read the article

  • How do I create an Access 2003 MDE programmatically or by command line in Access 2007?

    - by Ned Ryerson
    I have a legacy Access 2003 database file that must remain in that format to preserve its menus and toolbars. I have recently moved to Access 2007 in my build environment and will be deploying the compiled Access 2003 program with the Access 2007 runtime. In Access 2003, I could script the process of creating an MDE with the Access Developer Extensions (WZADE.mde) using the command line and an .xml file of build preferences (without creating an install package). The Access 2007 developer extensions do not seem to offer a similar option. I can "Package a Solution", but it creates an accdr and buries it in a CD installer. I've tried programmatic options like Docmd.RunCommand acMakeMDEFILe and Syscmd(603, mdbpath, mdepath) but they no longer work in Access 2007. Of course, I can manually create an MDE using Database Tools > Make MDE, but that is not scriptable as far as I can tell.

    Read the article

  • Is the usage of Isolated Storage in Silverlight 3 a security concern

    - by Prashant
    I am using Silverlight 3 on my website. I have a Login Page for role based authentication, that routes users with different privileges to different parts of the website. I want to use something analogous to the Session Variables available in standard ASP.Net applications. I intend to use Isolated Storage to achieve this. But I am skeptical about security in this option, as the Isolated Storage exists on the client side, and can be manipulated on client side. I am new to the Isolated Storage concept and don't know about the security options provided by it in terms of Encryption and server-side validation etc. If any of you have used it or are aware of the security provided in this case, could you please shed some light on the same. Thanks

    Read the article

  • Export Websphere 6.1 Profile with datasource configuration

    - by Virmundi
    Hello, I'm trying to export a profile from WAS 6.1 so that I can give it to other members of my team with all of the JNDI and Shared Library configurations in place. I've followed a few IBM tutorials on this like http://www-01.ibm.com/support/docview.wss?uid=swg21322309 (technically that is more a bug fix, but there is a similar page). I've tried to export the server using the "import" feature of the server in RAD 7. None of these options create a .car file with the resources sticking around. Does anyone know how to do this? Thanks, JPD

    Read the article

  • Configuring gcc compiler switches in Qt / QtCreator / QMake

    - by andand
    I recently tried to use Qt Creator 1.3.2 / Qt 4.6.2 / gcc 4.4.0 (32-bit version) on Windows 7 (64-bit) to compile an application using some of the experimental C++0x extensions and encountered the following (fatal) error: This file requires compiler and library support for the upcoming ISO C++ standard, C++0x. This support is currently experimental, and must be enabled with the -std=c++0x or -std=gnu++0x compiler options. In my search for a solution, I came across this thread, and added the following to the .pro file: CXXFLAGS += -std=c++0x but that didn't seem to make a difference. So, I expect there's some tag I need to add to the .pro (project) file, but I've never messed with the gcc compiler switches in Qt / QMake / QtCreator before, and am uncertain about the proper invocation / incantation. So, my question is how do you set gcc compiler switches when using QtCreator / QMake / Qt?
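    qmake does not read a plain CXXFLAGS variable; compiler flags for C++ sources go into QMAKE_CXXFLAGS (or the release/debug-specific variants). For example, in the .pro file:

        # passed straight through to g++ for every C++ compile
        QMAKE_CXXFLAGS += -std=c++0x

    After editing the .pro file, re-running qmake (Build > Run qmake in Qt Creator) should make the generated Makefile pick up the new flag.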

    Read the article

  • Show choosen option in a notification Feed, Django

    - by apoo
    Hey I have a model where : LIST_OPTIONS = ( ('cheap','cheap'), ('expensive','expensive'), ('normal', 'normal'), ) then I have assigned the LIST_OPTIONS to nature variable. nature = models.CharField(max_length=15, choices=LIST_OPTIONS, null=False, blank=False). then I save it: if self.pk: new=False else: new=True super(Listing, self).save(force_insert, force_update) if new and notification: notification.send(User.objects.all().exclude(id=self.owner.id), "listing_new", {'listing':self, }, ) then in my management.py: def create_notice_types(app, created_models,verbosity, **kwargs): notification.create_notice_type("listing_new", _("New Listing"), _("someone has posted a new listing"), default=2) and now in my notice.html I want to show to users different sentences based on the options that they have choose so something like this: LINK href="{{ listing.owner.get_absolute_url }} {{listing.owner}} {% ifequal listing.nature "For Sale" %} created a {{ listing.nature }} listing, <a href="{{ listing.get_absolute_url }}">{{listing.title}}</a>. {% ifequals listing.equal "Give Away"%} is {{ listing.nature }} , LINK href="{{ listing.get_absolute_url }}" {{listing.title}}. {% ifequal listing.equal "Looking For"%} is {{ listing.nature }} , LINK href="{{ listing.get_absolute_url }}" {{listing.title}} {% endifequal %} {% endifequal %} {% endifequal %} Could you please help me out with this. Thank you
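    As written, the template nests the {% ifequal %} blocks without closing them in order, misspells one as ifequals, and compares listing.nature against values ("For Sale", "Give Away", "Looking For") that are not among the cheap/expensive/normal choices defined on the model. A sketch with one closed block per stored value, assuming the choices stay as defined above:

        <a href="{{ listing.owner.get_absolute_url }}">{{ listing.owner }}</a>
        {% ifequal listing.nature "cheap" %}
            created a {{ listing.nature }} listing,
            <a href="{{ listing.get_absolute_url }}">{{ listing.title }}</a>.
        {% endifequal %}
        {% ifequal listing.nature "expensive" %}
            created an {{ listing.nature }} listing,
            <a href="{{ listing.get_absolute_url }}">{{ listing.title }}</a>.
        {% endifequal %}
        {% ifequal listing.nature "normal" %}
            created a {{ listing.nature }} listing,
            <a href="{{ listing.get_absolute_url }}">{{ listing.title }}</a>.
        {% endifequal %}

    For showing the human-readable label of a choices field, listing.get_nature_display is also available.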

    Read the article

  • Windows Azure: Announcing release of Windows Azure SDK 2.2 (with lots of goodies)

    - by ScottGu
    Earlier today I blogged about a big update we made today to Windows Azure, and some of the great new features it provides. Today I’m also excited to also announce the release of the Windows Azure SDK 2.2. Today’s SDK release adds even more great features including: Visual Studio 2013 Support Integrated Windows Azure Sign-In support within Visual Studio Remote Debugging Cloud Services with Visual Studio Firewall Management support within Visual Studio for SQL Databases Visual Studio 2013 RTM VM Images for MSDN Subscribers Windows Azure Management Libraries for .NET Updated Windows Azure PowerShell Cmdlets and ScriptCenter The below post has more details on what’s available in today’s Windows Azure SDK 2.2 release.  Also head over to Channel 9 to see the new episode of the Visual Studio Toolbox show that will be available shortly, and which highlights these features in a video demonstration. Visual Studio 2013 Support Version 2.2 of the Window Azure SDK is the first official version of the SDK to support the final RTM release of Visual Studio 2013. If you installed the 2.1 SDK with the Preview of Visual Studio 2013 we recommend that you upgrade your projects to SDK 2.2.  SDK 2.2 also works side by side with the SDK 2.0 and SDK 2.1 releases on Visual Studio 2012: Integrated Windows Azure Sign In within Visual Studio Integrated Windows Azure Sign-In support within Visual Studio is one of the big improvements added with this Windows Azure SDK release.  Integrated sign-in support enables developers to develop/test/manage Windows Azure resources within Visual Studio without having to download or use management certificates.  You can now just right-click on the “Windows Azure” icon within the Server Explorer inside Visual Studio and choose the “Connect to Windows Azure” context menu option to connect to Windows Azure: Doing this will prompt you to enter the email address of the account you wish to sign-in with: You can use either a Microsoft Account (e.g. Windows Live ID) or an Organizational account (e.g. Active Directory) as the email.  The dialog will update with an appropriate login prompt depending on which type of email address you enter: Once you sign-in you’ll see the Windows Azure resources that you have permissions to manage show up automatically within the Visual Studio Server Explorer (and you can start using them): With this new integrated sign in experience you are now able to publish web apps, deploy VMs and cloud services, use Windows Azure diagnostics, and fully interact with your Windows Azure services within Visual Studio without the need for a management certificate.  All of the authentication is handled using the Windows Azure Active Directory associated with your Windows Azure account (details on this can be found in my earlier blog post). Integrating authentication this way end-to-end across the Service Management APIs + Dev Tools + Management Portal + PowerShell automation scripts enables a much more secure and flexible security model within Windows Azure, and makes it much more convenient to securely manage multiple developers + administrators working on a project.  It also allows organizations and enterprises to use the same authentication model that they use for their developers on-premises in the cloud.  It also ensures that employees who leave an organization immediately lose access to their company’s cloud based resources once their Active Directory account is suspended. 
Filtering/Subscription Management Once you login within Visual Studio, you can filter which Windows Azure subscriptions/regions are visible within the Server Explorer by right-clicking the “Filter Services” context menu within the Server Explorer.  You can also use the “Manage Subscriptions” context menu to mange your Windows Azure Subscriptions: Bringing up the “Manage Subscriptions” dialog allows you to see which accounts you are currently using, as well as which subscriptions are within them: The “Certificates” tab allows you to continue to import and use management certificates to manage Windows Azure resources as well.  We have not removed any functionality with today’s update – all of the existing scenarios that previously supported management certificates within Visual Studio continue to work just fine.  The new integrated sign-in support provided with today’s release is purely additive. Note: the SQL Database node and the Mobile Service node in Server Explorer do not support integrated sign-in at this time. Therefore, you will only see databases and mobile services under those nodes if you have a management certificate to authorize access to them.  We will enable them with integrated sign-in in a future update. Remote Debugging Cloud Resources within Visual Studio Today’s Windows Azure SDK 2.2 release adds support for remote debugging many types of Windows Azure resources. With live, remote debugging support from within Visual Studio, you are now able to have more visibility than ever before into how your code is operating live in Windows Azure.  Let’s walkthrough how to enable remote debugging for a Cloud Service: Remote Debugging of Cloud Services To enable remote debugging for your cloud service, select Debug as the Build Configuration on the Common Settings tab of your Cloud Service’s publish dialog wizard: Then click the Advanced Settings tab and check the Enable Remote Debugging for all roles checkbox: Once your cloud service is published and running live in the cloud, simply set a breakpoint in your local source code: Then use Visual Studio’s Server Explorer to select the Cloud Service instance deployed in the cloud, and then use the Attach Debugger context menu on the role or to a specific VM instance of it: Once the debugger attaches to the Cloud Service, and a breakpoint is hit, you’ll be able to use the rich debugging capabilities of Visual Studio to debug the cloud instance remotely, in real-time, and see exactly how your app is running in the cloud. Today’s remote debugging support is super powerful, and makes it much easier to develop and test applications for the cloud.  Support for remote debugging Cloud Services is available as of today, and we’ll also enable support for remote debugging Web Sites shortly. Firewall Management Support with SQL Databases By default we enable a security firewall around SQL Databases hosted within Windows Azure.  This ensures that only your application (or IP addresses you approve) can connect to them and helps make your infrastructure secure by default.  This is great for protection at runtime, but can sometimes be a pain at development time (since by default you can’t connect/manage the database remotely within Visual Studio if the security firewall blocks your instance of VS from connecting to it). One of the cool features we’ve added with today’s release is support that makes it easy to enable and configure the security firewall directly within Visual Studio.  
Now with the SDK 2.2 release, when you try and connect to a SQL Database using the Visual Studio Server Explorer, and a firewall rule prevents access to the database from your machine, you will be prompted to add a firewall rule to enable access from your local IP address: You can simply click Add Firewall Rule and a new rule will be automatically added for you. In some cases, the logic to detect your local IP may not be sufficient (for example: you are behind a corporate firewall that uses a range of IP addresses) and you may need to set up a firewall rule for a range of IP addresses in order to gain access. The new Add Firewall Rule dialog also makes this easy to do.  Once connected you’ll be able to manage your SQL Database directly within the Visual Studio Server Explorer: This makes it much easier to work with databases in the cloud. Visual Studio 2013 RTM Virtual Machine Images Available for MSDN Subscribers Last week we released the General Availability Release of Visual Studio 2013 to the web.  This is an awesome release with a ton of new features. With today’s Windows Azure update we now have a set of pre-configured VM images of VS 2013 available within the Windows Azure Management Portal for use by MSDN customers.  This enables you to create a VM in the cloud with VS 2013 pre-installed on it in with only a few clicks: Windows Azure now provides the fastest and easiest way to get started doing development with Visual Studio 2013. Windows Azure Management Libraries for .NET (Preview) Having the ability to automate the creation, deployment, and tear down of resources is a key requirement for applications running in the cloud.  It also helps immensely when running dev/test scenarios and coded UI tests against pre-production environments. Today we are releasing a preview of a new set of Windows Azure Management Libraries for .NET.  These new libraries make it easy to automate tasks using any .NET language (e.g. C#, VB, F#, etc).  Previously this automation capability was only available through the Windows Azure PowerShell Cmdlets or to developers who were willing to write their own wrappers for the Windows Azure Service Management REST API. Modern .NET Developer Experience We’ve worked to design easy-to-understand .NET APIs that still map well to the underlying REST endpoints, making sure to use and expose the modern .NET functionality that developers expect today: Portable Class Library (PCL) support targeting applications built for any .NET Platform (no platform restriction) Shipped as a set of focused NuGet packages with minimal dependencies to simplify versioning Support async/await task based asynchrony (with easy sync overloads) Shared infrastructure for common error handling, tracing, configuration, HTTP pipeline manipulation, etc. 
Factored for easy testability and mocking Built on top of popular libraries like HttpClient and Json.NET Below is a list of a few of the management client classes that are shipping with today’s initial preview release: .NET Class Name Supports Operations for these Assets (and potentially more) ManagementClient Locations Credentials Subscriptions Certificates ComputeManagementClient Hosted Services Deployments Virtual Machines Virtual Machine Images & Disks StorageManagementClient Storage Accounts WebSiteManagementClient Web Sites Web Site Publish Profiles Usage Metrics Repositories VirtualNetworkManagementClient Networks Gateways Automating Creating a Virtual Machine using .NET Let’s walkthrough an example of how we can use the new Windows Azure Management Libraries for .NET to fully automate creating a Virtual Machine. I’m deliberately showing a scenario with a lot of custom options configured – including VHD image gallery enumeration, attaching data drives, network endpoints + firewall rules setup - to show off the full power and richness of what the new library provides. We’ll begin with some code that demonstrates how to enumerate through the built-in Windows images within the standard Windows Azure VM Gallery.  We’ll search for the first VM image that has the word “Windows” in it and use that as our base image to build the VM from.  We’ll then create a cloud service container in the West US region to host it within: We can then customize some options on it such as setting up a computer name, admin username/password, and hostname.  We’ll also open up a remote desktop (RDP) endpoint through its security firewall: We’ll then specify the VHD host and data drives that we want to mount on the Virtual Machine, and specify the size of the VM we want to run it in: Once everything has been set up the call to create the virtual machine is executed asynchronously In a few minutes we’ll then have a completely deployed VM running on Windows Azure with all of the settings (hard drives, VM size, machine name, username/password, network endpoints + firewall settings) fully configured and ready for us to use: Preview Availability via NuGet The Windows Azure Management Libraries for .NET are now available via NuGet. Because they are still in preview form, you’ll need to add the –IncludePrerelease switch when you go to retrieve the packages. The Package Manager Console screen shot below demonstrates how to get the entire set of libraries to manage your Windows Azure assets: You can also install them within your .NET projects by right clicking on the VS Solution Explorer and using the Manage NuGet Packages context menu command.  Make sure to select the “Include Prerelease” drop-down for them to show up, and then you can install the specific management libraries you need for your particular scenarios: Open Source License The new Windows Azure Management Libraries for .NET make it super easy to automate management operations within Windows Azure – whether they are for Virtual Machines, Cloud Services, Storage Accounts, Web Sites, and more.  Like the rest of the Windows Azure SDK, we are releasing the source code under an open source (Apache 2) license and it is hosted at https://github.com/WindowsAzure/azure-sdk-for-net/tree/master/libraries if you wish to contribute. PowerShell Enhancements and our New Script Center Today, we are also shipping Windows Azure PowerShell 0.7.0 (which is a separate download). You can find the full change log here. 
Here are some of the improvements provided with it: Windows Azure Active Directory authentication support Script Center providing many sample scripts to automate common tasks on Windows Azure New cmdlets for Media Services and SQL Database Script Center Windows Azure enables you to script and automate a lot of tasks using PowerShell.  People often ask for more pre-built samples of common scenarios so that they can use them to learn and tweak/customize. With this in mind, we are excited to introduce a new Script Center that we are launching for Windows Azure. You can learn about how to scripting with Windows Azure with a get started article. You can then find many sample scripts across different solutions, including infrastructure, data management, web, and more: All of the sample scripts are hosted on TechNet with links from the Windows Azure Script Center. Each script is complete with good code comments, detailed descriptions, and examples of usage. Summary Visual Studio 2013 and the Windows Azure SDK 2.2 make it easier than ever to get started developing rich cloud applications. Along with the Windows Azure Developer Center’s growing set of .NET developer resources to guide your development efforts, today’s Windows Azure SDK 2.2 release should make your development experience more enjoyable and efficient. If you don’t already have a Windows Azure account, you can sign-up for a free trial and start using all of the above features today.  Then visit the Windows Azure Developer Center to learn more about how to build apps with it. Hope this helps, Scott P.S. In addition to blogging, I am also now using Twitter for quick updates and to share links. Follow me at: twitter.com/scottgu

    Read the article

  • Lazarus: Can't find unit [unit] used by [program]

    - by Ree
    I'm trying to use an external library (wingraph) in a simple program. I have .o and .ppu files. I added the directory that contains them to the list of both "Other Unit Files" and "Include Files" paths under Project-Compiler Options. When building, I still get the error "Can't find unit wingraph used by [program]". The library is Windows specific and I'm compiling on Windows, too. What should I do to solve the problem? Note that I don't have extensive knowledge about Pascal itself nor its tools. I'm just trying to quickly help someone start using the library.

    Read the article

  • Missing Menu Bar in Visual Studio 2010

    - by Jeremy Sullivan
    When I opened up Visual Studio 2010 this morning, my Menu Bar (you know, the bar with File, Edit, etc. on it) was missing. I've tried all of the right-click menus, customize options, Function keys and Google searches that I can think of but to no avail. This is exactly the kind of humiliation that brings sobriety to my ever-growing ego as a developer. I'm actually posting this on StackOverflow.com for the world to see. Next, I will be just as humiliated when someone replies to this and tells me precisely how simple it is and how ignorant that I am for not knowing the simple key combination or finding it on Google. Thank you, your majesty, for your sharp retort in advance.

    Read the article

  • What is best way to raise application events for use with admin network monitoring tools

    - by J Cooper
    In a semi-critical .NET service application, what would be a good strategy for raising application events that could be monitored by network administration tools? The events would be for errors, status changes, and possibly other notifications. My company is planning to use some kind of tool down the road to monitor all critical machines and services. As of right now they are using Spice Works to do some monitoring but it is not known if they will keep this down the road. By strategy I mean, perhaps using some sort of protocol ( my network admin has mentioned SNMP), perhaps a service such as windows event log. I have no idea what is available, so I'm leaving the options open. With that in mind, here is a list of preferences I came up with: Somewhat easy to use with .NET. Reliability Should work well with a variety of admin monitoring tools Works with non - windows monitoring tools Works with Spice Works
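    If the monitoring tool can watch the Windows Event Log (Spiceworks and most agent-based tools can read it), writing events there from .NET is the lowest-friction option; SNMP traps need extra libraries and agent configuration. A minimal sketch using System.Diagnostics; the source name is an example and has to be registered once, usually by the installer, since that step needs admin rights:

        using System.Diagnostics;

        static class MonitoringEvents
        {
            const string Source = "MyCriticalService";   // example name

            public static void ReportError(string message)
            {
                // One-time registration (needs admin rights; usually done by the installer)
                if (!EventLog.SourceExists(Source))
                    EventLog.CreateEventSource(Source, "Application");

                // Event id 1001 is arbitrary; monitoring tools can filter on it
                EventLog.WriteEntry(Source, message, EventLogEntryType.Error, 1001);
            }
        }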

    Read the article

  • Sinatra 1.0 fastcgi deployment

    - by TheMoonMaster
    I am trying to deploy my sinatra app to my hosting(shared) and I keep getting this error. /usr/lib/ruby/gems/1.8/gems/rack-1.1.0/lib/rack/handler/fastcgi.rb:23:in `initialize': Address family not supported by protocol - socket(2) (Errno::EAFNOSUPPORT) from /usr/lib/ruby/gems/1.8/gems/rack-1.1.0/lib/rack/handler/fastcgi.rb:23:in `new' from /usr/lib/ruby/gems/1.8/gems/rack-1.1.0/lib/rack/handler/fastcgi.rb:23:in `run' from /usr/lib/ruby/gems/1.8/gems/sinatra-1.0/lib/sinatra/base.rb:946:in `run!' from /usr/lib/ruby/gems/1.8/gems/sinatra-1.0/lib/sinatra/main.rb:25 from dispatch.fcgi:17 I have no idea what this means and I have tried many different things to fix it but nothing I tried seemed to work. My dispatch.fcgi is the following #!/usr/bin/ruby require 'rubygems' require 'sinatra' fastcgi_log = File.open("fastcgi.log", "a") STDOUT.reopen fastcgi_log STDERR.reopen fastcgi_log STDOUT.sync = true set :logging, false set :server, "FastCGI" load 'simple.rb' And finally, my .htaccess (fcgid is how my host told me to set it up) RewriteEngine on AddHandler fcgid-script .fcgi Options +FollowSymLinks +ExecCGI RewriteRule ^(.*)$ dispatch.fcgi [QSA,L]
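    The trace shows Sinatra's built-in run! trying to open its own TCP socket, which many shared hosts do not allow; a common shape for dispatch.fcgi is to disable the built-in server and hand the app to Rack's FastCGI handler directly, which talks to the web server over the FastCGI socket instead. A sketch, with the details of this particular host being an assumption:

        #!/usr/bin/ruby
        require 'rubygems'
        require 'rack'
        require 'sinatra'

        fastcgi_log = File.open("fastcgi.log", "a")
        STDOUT.reopen fastcgi_log
        STDERR.reopen fastcgi_log
        STDOUT.sync = true

        set :logging, false
        set :run, false          # stop Sinatra from starting its own server at exit

        load 'simple.rb'

        Rack::Handler::FastCGI.run Sinatra::Application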

    Read the article

  • curl object moved here error, bash

    - by adam n
    I'm trying to download an HTML file with curl in bash. When I download it manually, it works fine. However, when I try and run my script through crontab, the output HTML file is very small and just says "Object moved to here." with a broken link. Does this have something to do with the sparse environment the crontab commands run in? I found this question: http://stackoverflow.com/questions/1279340/php-ssl-curl-object-moved-error but I'm using bash, not PHP. What are the equivalent command line options or variables to set to fix this problem in bash? (I want to do this with curl, not wget)
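    "Object moved" is the body of an HTTP 302 redirect, which curl does not follow by default; the -L/--location flag makes it follow the redirect, and sending a browser-like user agent sometimes helps when the server treats the bare cron environment differently. For example (URL is a placeholder):

        # follow redirects (-L), send a user agent (-A), save to a file (-o)
        curl -L -A "Mozilla/5.0" -o page.html "http://example.com/page.html"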

    Read the article

  • Grub 'Read Error' - Only Loads with LiveCD

    - by Ryan Sharp
    Problem After installing Ubuntu to complete my Windows 7/Ubuntu 12.04 dual-boot setup, Grub just wouldn't load at all unless I boot from the LiveCD. Afterwards, everything works completely normal. However, this workaround isn't a solution and I'd like to be able to boot without the aid of a disc. Fdisk -l Using the fdisk -l command, I am given the following: Disk /dev/sda: 64.0 GB, 64023257088 bytes 255 heads, 63 sectors/track, 7783 cylinders, total 125045424 sectors Units = sectors of 1 * 512 = 512 bytes Sector size (logical/physical): 512 bytes / 512 bytes I/O size (minimum/optimal): 512 bytes / 512 bytes Disk identifier: 0x324971d1 Device Boot Start End Blocks Id System /dev/sda1 2048 206847 102400 7 HPFS/NTFS/exFAT /dev/sda2 208896 48957439 24374272 7 HPFS/NTFS/exFAT /dev/sda3 * 48959486 124067839 37554177 5 Extended /dev/sda5 48959488 124067839 37554176 83 Linux Disk /dev/sdb: 1000.2 GB, 1000204886016 bytes 255 heads, 63 sectors/track, 121601 cylinders, total 1953525168 sectors Units = sectors of 1 * 512 = 512 bytes Sector size (logical/physical): 512 bytes / 512 bytes I/O size (minimum/optimal): 512 bytes / 512 bytes Disk identifier: 0xc0ee6a69 Device Boot Start End Blocks Id System /dev/sdb1 1024208894 1953523711 464657409 5 Extended /dev/sdb3 * 2048 1024206847 512102400 7 HPFS/NTFS/exFAT /dev/sdb5 1024208896 1937897471 456844288 83 Linux /dev/sdb6 1937899520 1953523711 7812096 82 Linux swap / Solaris Partition table entries are not in disk order Disk /dev/sdc: 320.1 GB, 320072933376 bytes 255 heads, 63 sectors/track, 38913 cylinders, total 625142448 sectors Units = sectors of 1 * 512 = 512 bytes Sector size (logical/physical): 512 bytes / 512 bytes I/O size (minimum/optimal): 512 bytes / 512 bytes Disk identifier: 0x292eee23 Device Boot Start End Blocks Id System /dev/sdc1 2048 625141759 312569856 7 HPFS/NTFS/exFAT Bootinfoscript I've used the BootInfoScript, and received the following output: Boot Info Script 0.61 [1 April 2012] ============================= Boot Info Summary: =============================== => Grub2 (v1.99) is installed in the MBR of /dev/sda and looks at sector 1 of the same hard drive for core.img. core.img is at this location and looks for (,msdos5)/boot/grub on this drive. => Grub2 (v1.99) is installed in the MBR of /dev/sdb and looks at sector 1 of the same hard drive for core.img. core.img is at this location and looks for (,msdos5)/boot/grub on this drive. => Windows is installed in the MBR of /dev/sdc. sda1: __________________________________________________________________________ File system: ntfs Boot sector type: Windows Vista/7: NTFS Boot sector info: No errors found in the Boot Parameter Block. Operating System: Boot files: /bootmgr /Boot/BCD sda2: __________________________________________________________________________ File system: ntfs Boot sector type: Windows Vista/7: NTFS Boot sector info: No errors found in the Boot Parameter Block. 
Operating System: Windows 7 Boot files: /bootmgr /Boot/BCD /Windows/System32/winload.exe sda3: __________________________________________________________________________ File system: Extended Partition Boot sector type: Unknown Boot sector info: sda5: __________________________________________________________________________ File system: ext4 Boot sector type: - Boot sector info: Operating System: Ubuntu 12.04.1 LTS Boot files: /boot/grub/grub.cfg /etc/fstab /boot/grub/core.img sdb1: __________________________________________________________________________ File system: Extended Partition Boot sector type: - Boot sector info: sdb5: __________________________________________________________________________ File system: ext4 Boot sector type: - Boot sector info: Operating System: Boot files: sdb6: __________________________________________________________________________ File system: swap Boot sector type: - Boot sector info: sdb3: __________________________________________________________________________ File system: ntfs Boot sector type: Windows Vista/7: NTFS Boot sector info: According to the info in the boot sector, sdb3 starts at sector 200744960. But according to the info from fdisk, sdb3 starts at sector 2048. According to the info in the boot sector, sdb3 has 823461887 sectors, but according to the info from fdisk, it has 1024204799 sectors. Operating System: Boot files: sdc1: __________________________________________________________________________ File system: ntfs Boot sector type: Windows Vista/7: NTFS Boot sector info: No errors found in the Boot Parameter Block. Operating System: Boot files: ============================ Drive/Partition Info: ============================= Drive: sda _____________________________________________________________________ Disk /dev/sda: 64.0 GB, 64023257088 bytes 255 heads, 63 sectors/track, 7783 cylinders, total 125045424 sectors Units = sectors of 1 * 512 = 512 bytes Sector size (logical/physical): 512 bytes / 512 bytes Partition Boot Start Sector End Sector # of Sectors Id System /dev/sda1 2,048 206,847 204,800 7 NTFS / exFAT / HPFS /dev/sda2 208,896 48,957,439 48,748,544 7 NTFS / exFAT / HPFS /dev/sda3 * 48,959,486 124,067,839 75,108,354 5 Extended /dev/sda5 48,959,488 124,067,839 75,108,352 83 Linux Drive: sdb _____________________________________________________________________ Disk /dev/sdb: 1000.2 GB, 1000204886016 bytes 255 heads, 63 sectors/track, 121601 cylinders, total 1953525168 sectors Units = sectors of 1 * 512 = 512 bytes Sector size (logical/physical): 512 bytes / 512 bytes Partition Boot Start Sector End Sector # of Sectors Id System /dev/sdb1 1,024,208,894 1,953,523,711 929,314,818 5 Extended /dev/sdb5 1,024,208,896 1,937,897,471 913,688,576 83 Linux /dev/sdb6 1,937,899,520 1,953,523,711 15,624,192 82 Linux swap / Solaris /dev/sdb3 * 2,048 1,024,206,847 1,024,204,800 7 NTFS / exFAT / HPFS Drive: sdc _____________________________________________________________________ Disk /dev/sdc: 320.1 GB, 320072933376 bytes 255 heads, 63 sectors/track, 38913 cylinders, total 625142448 sectors Units = sectors of 1 * 512 = 512 bytes Sector size (logical/physical): 512 bytes / 512 bytes Partition Boot Start Sector End Sector # of Sectors Id System /dev/sdc1 2,048 625,141,759 625,139,712 7 NTFS / exFAT / HPFS "blkid" output: ________________________________________________________________ Device UUID TYPE LABEL /dev/sda1 A48056DF8056B80E ntfs System Reserved /dev/sda2 A8C6D6A4C6D671D4 ntfs Windows /dev/sda5 fd71c537-3715-44e1-b1fe-07537e22b3dd 
ext4 /dev/sdb3 6373D03D0A3747A8 ntfs Steam /dev/sdb5 6f5a6eb3-a932-45aa-893e-045b57708270 ext4 /dev/sdb6 469848c8-867a-41b7-b0e1-b813a43c64af swap /dev/sdc1 725D7B961CF34B1B ntfs backup ================================ Mount points: ================================= Device Mount_Point Type Options /dev/sda5 / ext4 (rw,noatime,nodiratime,discard,errors=remount-ro) /dev/sdb5 /home ext4 (rw) =========================== sda5/boot/grub/grub.cfg: =========================== -------------------------------------------------------------------------------- # # DO NOT EDIT THIS FILE # # It is automatically generated by grub-mkconfig using templates # from /etc/grub.d and settings from /etc/default/grub # ### BEGIN /etc/grub.d/00_header ### if [ -s $prefix/grubenv ]; then set have_grubenv=true load_env fi set default="0" if [ "${prev_saved_entry}" ]; then set saved_entry="${prev_saved_entry}" save_env saved_entry set prev_saved_entry= save_env prev_saved_entry set boot_once=true fi function savedefault { if [ -z "${boot_once}" ]; then saved_entry="${chosen}" save_env saved_entry fi } function recordfail { set recordfail=1 if [ -n "${have_grubenv}" ]; then if [ -z "${boot_once}" ]; then save_env recordfail; fi; fi } function load_video { insmod vbe insmod vga insmod video_bochs insmod video_cirrus } insmod part_msdos insmod ext2 set root='(hd0,msdos5)' search --no-floppy --fs-uuid --set=root fd71c537-3715-44e1-b1fe-07537e22b3dd if loadfont /usr/share/grub/unicode.pf2 ; then set gfxmode=auto load_video insmod gfxterm insmod part_msdos insmod ext2 set root='(hd0,msdos5)' search --no-floppy --fs-uuid --set=root fd71c537-3715-44e1-b1fe-07537e22b3dd set locale_dir=($root)/boot/grub/locale set lang=en_GB insmod gettext fi terminal_output gfxterm if [ "${recordfail}" = 1 ]; then set timeout=-1 else set timeout=10 fi ### END /etc/grub.d/00_header ### ### BEGIN /etc/grub.d/05_debian_theme ### set menu_color_normal=white/black set menu_color_highlight=black/light-gray if background_color 44,0,30; then clear fi ### END /etc/grub.d/05_debian_theme ### ### BEGIN /etc/grub.d/10_linux ### function gfxmode { set gfxpayload="${1}" if [ "${1}" = "keep" ]; then set vt_handoff=vt.handoff=7 else set vt_handoff= fi } if [ "${recordfail}" != 1 ]; then if [ -e ${prefix}/gfxblacklist.txt ]; then if hwmatch ${prefix}/gfxblacklist.txt 3; then if [ ${match} = 0 ]; then set linux_gfx_mode=keep else set linux_gfx_mode=text fi else set linux_gfx_mode=text fi else set linux_gfx_mode=keep fi else set linux_gfx_mode=text fi export linux_gfx_mode if [ "${linux_gfx_mode}" != "text" ]; then load_video; fi menuentry 'Ubuntu, with Linux 3.2.0-29-generic' --class ubuntu --class gnu-linux --class gnu --class os { recordfail gfxmode $linux_gfx_mode insmod gzio insmod part_msdos insmod ext2 set root='(hd0,msdos5)' search --no-floppy --fs-uuid --set=root fd71c537-3715-44e1-b1fe-07537e22b3dd linux /boot/vmlinuz-3.2.0-29-generic root=UUID=fd71c537-3715-44e1-b1fe-07537e22b3dd ro quiet splash $vt_handoff initrd /boot/initrd.img-3.2.0-29-generic } menuentry 'Ubuntu, with Linux 3.2.0-29-generic (recovery mode)' --class ubuntu --class gnu-linux --class gnu --class os { recordfail insmod gzio insmod part_msdos insmod ext2 set root='(hd0,msdos5)' search --no-floppy --fs-uuid --set=root fd71c537-3715-44e1-b1fe-07537e22b3dd echo 'Loading Linux 3.2.0-29-generic ...' linux /boot/vmlinuz-3.2.0-29-generic root=UUID=fd71c537-3715-44e1-b1fe-07537e22b3dd ro recovery nomodeset echo 'Loading initial ramdisk ...' 
initrd /boot/initrd.img-3.2.0-29-generic } ### END /etc/grub.d/10_linux ### ### BEGIN /etc/grub.d/20_linux_xen ### ### END /etc/grub.d/20_linux_xen ### ### BEGIN /etc/grub.d/20_memtest86+ ### menuentry "Memory test (memtest86+)" { insmod part_msdos insmod ext2 set root='(hd0,msdos5)' search --no-floppy --fs-uuid --set=root fd71c537-3715-44e1-b1fe-07537e22b3dd linux16 /boot/memtest86+.bin } menuentry "Memory test (memtest86+, serial console 115200)" { insmod part_msdos insmod ext2 set root='(hd0,msdos5)' search --no-floppy --fs-uuid --set=root fd71c537-3715-44e1-b1fe-07537e22b3dd linux16 /boot/memtest86+.bin console=ttyS0,115200n8 } ### END /etc/grub.d/20_memtest86+ ### ### BEGIN /etc/grub.d/30_os-prober ### menuentry "Windows 7 (loader) (on /dev/sda1)" --class windows --class os { insmod part_msdos insmod ntfs set root='(hd0,msdos1)' search --no-floppy --fs-uuid --set=root A48056DF8056B80E chainloader +1 } menuentry "Windows 7 (loader) (on /dev/sda2)" --class windows --class os { insmod part_msdos insmod ntfs set root='(hd0,msdos2)' search --no-floppy --fs-uuid --set=root A8C6D6A4C6D671D4 chainloader +1 } ### END /etc/grub.d/30_os-prober ### ### BEGIN /etc/grub.d/40_custom ### # This file provides an easy way to add custom menu entries. Simply type the # menu entries you want to add after this comment. Be careful not to change # the 'exec tail' line above. ### END /etc/grub.d/40_custom ### ### BEGIN /etc/grub.d/41_custom ### if [ -f $prefix/custom.cfg ]; then source $prefix/custom.cfg; fi ### END /etc/grub.d/41_custom ### -------------------------------------------------------------------------------- =============================== sda5/etc/fstab: ================================ -------------------------------------------------------------------------------- # /etc/fstab: static file system information. # # Use 'blkid' to print the universally unique identifier for a # device; this may be used with UUID= as a more robust way to name devices # that works even if disks are added and removed. See fstab(5). 
# # <file system> <mount point> <type> <options> <dump> <pass> proc /proc proc nodev,noexec,nosuid 0 0 # / was on /dev/sda5 during installation UUID=fd71c537-3715-44e1-b1fe-07537e22b3dd / ext4 noatime,nodiratime,discard,errors=remount-ro 0 1 # /home was on /dev/sdb5 during installation UUID=6f5a6eb3-a932-45aa-893e-045b57708270 /home ext4 defaults 0 2 # swap was on /dev/sdb6 during installation UUID=469848c8-867a-41b7-b0e1-b813a43c64af none swap sw 0 0 tmpfs /tmp tmpfs defaults,noatime,mode=1777 0 0 -------------------------------------------------------------------------------- =================== sda5: Location of files loaded by Grub: ==================== GiB - GB File Fragment(s) = boot/grub/core.img 1 = boot/grub/grub.cfg 1 = boot/initrd.img-3.2.0-29-generic 2 = boot/vmlinuz-3.2.0-29-generic 1 = initrd.img 2 = vmlinuz 1 ======================== Unknown MBRs/Boot Sectors/etc: ======================== Unknown BootLoader on sda3 00000000 63 6f 70 69 61 20 65 20 63 6f 6c 61 41 63 65 64 |copia e colaAced| 00000010 65 72 20 61 20 74 6f 64 6f 20 6f 20 74 65 78 74 |er a todo o text| 00000020 6f 20 66 61 6c 61 64 6f 20 75 74 69 6c 69 7a 61 |o falado utiliza| 00000030 6e 64 6f 20 61 20 63 6f 6e 76 65 72 73 c3 a3 6f |ndo a convers..o| 00000040 20 64 65 20 74 65 78 74 6f 20 70 61 72 61 20 76 | de texto para v| 00000050 6f 7a 4d 61 6e 69 70 75 6c 61 72 20 61 73 20 64 |ozManipular as d| 00000060 65 66 69 6e 69 c3 a7 c3 b5 65 73 20 71 75 65 20 |efini....es que | 00000070 63 6f 6e 74 72 6f 6c 61 6d 20 6f 20 61 63 65 73 |controlam o aces| 00000080 73 6f 20 64 65 20 57 65 62 73 69 74 65 73 20 61 |so de Websites a| 00000090 20 63 6f 6f 6b 69 65 73 2c 20 4a 61 76 61 53 63 | cookies, JavaSc| 000000a0 72 69 70 74 20 65 20 70 6c 75 67 2d 69 6e 73 4d |ript e plug-insM| 000000b0 61 6e 69 70 75 6c 61 72 20 61 73 20 64 65 66 69 |anipular as defi| 000000c0 6e 69 c3 a7 c3 b5 65 73 20 72 65 6c 61 63 69 6f |ni....es relacio| 000000d0 6e 61 64 61 73 20 63 6f 6d 20 70 72 69 76 61 63 |nadas com privac| 000000e0 69 64 61 64 65 41 63 65 64 65 72 20 61 6f 73 20 |idadeAceder aos | 000000f0 73 65 75 73 20 70 65 72 69 66 c3 a9 72 69 63 6f |seus perif..rico| 00000100 73 20 55 53 42 55 74 69 6c 69 7a 61 72 20 6f 20 |s USBUtilizar o | 00000110 73 65 75 20 6d 69 63 72 6f 66 6f 6e 65 55 74 69 |seu microfoneUti| 00000120 6c 69 7a 61 72 20 61 20 73 75 61 20 63 c3 a2 6d |lizar a sua c..m| 00000130 61 72 61 55 74 69 6c 69 7a 61 72 20 6f 20 73 65 |araUtilizar o se| 00000140 75 20 6d 69 63 72 6f 66 6f 6e 65 20 65 20 61 20 |u microfone e a | 00000150 63 c3 a2 6d 61 72 61 4e c3 a3 6f 20 66 6f 69 20 |c..maraN..o foi | 00000160 70 6f 73 73 c3 ad 76 65 6c 20 65 6e 63 6f 6e 74 |poss..vel encont| 00000170 72 61 72 20 6f 20 63 61 6d 69 6e 68 6f 20 61 62 |rar o caminho ab| 00000180 73 6f 6c 75 74 6f 20 70 61 72 61 20 6f 20 64 69 |soluto para o di| 00000190 72 65 63 74 c3 b3 72 69 6f 20 61 20 65 6d 70 61 |rect..rio a empa| 000001a0 63 6f 74 61 72 2e 4f 20 64 69 72 65 63 74 c3 b3 |cotar.O direct..| 000001b0 72 69 6f 20 64 65 20 65 6e 74 72 61 64 61 00 fe |rio de entrada..| 000001c0 ff ff 83 fe ff ff 02 00 00 00 00 10 7a 04 00 00 |............z...| 000001d0 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 |................| * 000001f0 00 00 00 00 00 00 00 00 00 00 00 00 00 00 55 aa |..............U.| 00000200 =============================== StdErr Messages: =============================== xz: (stdin): Compressed data is corrupt xz: (stdin): Compressed data is corrupt awk: cmd. line:36: Math support is not compiled in awk: cmd. 
line:36: Math support is not compiled in awk: cmd. line:36: Math support is not compiled in awk: cmd. line:36: Math support is not compiled in awk: cmd. line:36: Math support is not compiled in awk: cmd. line:36: Math support is not compiled in

    Begging / Appreciation ;) If anything else is required to solve my problem, please ask. My only hopes are that I can solve this, and that doing so won't require reinstalling Grub (given how complicated the procedures are) or reinstalling the OSes, as I have already done that about six times since Friday due to several other issues I've encountered. Thank you, and good day.

    System: Ubuntu 12.04 64-bit / Windows 7 SP1 64-bit; 64 GB SSD as boot/OS drive, 1 TB HDD as /home, swap, and Steam drive.
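    The Boot Info Script output above shows Grub2 in the MBR of /dev/sda and /dev/sdb but plain Windows boot code in the MBR of /dev/sdc, so one plausible explanation is that the BIOS is booting from a disk whose MBR (or embedded core.img) is not the one that points at the Ubuntu partition. A common repair is to reinstall GRUB from a chroot on the LiveCD; a rough sketch follows, under the assumptions that /dev/sda5 is the Ubuntu root partition (as reported above) and that /dev/sda is the disk the BIOS boots from, both of which should be verified first:

        # From the LiveCD ("Try Ubuntu"), in a terminal.
        # Assumptions: root filesystem is /dev/sda5 and the BIOS boots /dev/sda.
        sudo mount /dev/sda5 /mnt
        sudo mount --bind /dev  /mnt/dev
        sudo mount --bind /proc /mnt/proc
        sudo mount --bind /sys  /mnt/sys
        sudo chroot /mnt grub-install /dev/sda   # rewrite the MBR and core.img on the boot disk
        sudo chroot /mnt update-grub             # regenerate /boot/grub/grub.cfg
        sudo umount /mnt/sys /mnt/proc /mnt/dev /mnt

    If the BIOS in fact boots from /dev/sdb or /dev/sdc, substitute that whole device (never a partition such as /dev/sda1) in the grub-install line; the Boot-Repair utility, installable on the LiveCD, automates much of the same procedure.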

    Read the article
