Search Results

Search found 24353 results on 975 pages for 'test coverage'.


  • How to specify a network address for a DHCP client

    - by drecute
    I have set up and configured a DHCP server on a SPARC machine running Solaris 10. Now I want to test the DHCP server by creating a DHCP client on another computer running Solaris 11. How do I specify a network address for a DHCP client so that the IP address it is assigned falls within a specific subnet? For example, the DHCP server host is 172.1.1.1, and I want the client machine to get an IP address in the 172.1.1.0/24 range (netmask 255.255.255.0).
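
    For reference, the subnet a client lands in is normally declared on the server side, keyed to the network the request arrives from. A minimal sketch in ISC dhcpd syntax (the Solaris 10 native server is configured with dhcpconfig/pntadm instead, so treat this purely as an illustration of the idea; the address pool is invented):

        # dhcpd.conf - clients on this network draw from this pool
        subnet 172.1.1.0 netmask 255.255.255.0 {
            range 172.1.1.100 172.1.1.200;   # addresses handed to clients
            option routers 172.1.1.1;        # default gateway for the subnet
        }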

    Read the article

  • Testing UDP port connectivity

    - by Lock
    I am trying to test whether I can reach a particular UDP port on a remote server (I have access to both machines, and both are internet-facing). I am using netcat to listen on the port, and then nmap to check whether that port shows as open, but it doesn't appear to be. iptables is turned off. Any suggestions as to why this could be? I am eventually going to set up a VPN tunnel, but because I'm very new to tunnels, I want to confirm connectivity on UDP port 1194 before going further.
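
    For what it's worth, here is one way to run that test end to end with the tools the question mentions (hostname is a placeholder, and flag spelling varies between netcat variants; some need -p for the listen port). Note that UDP is connectionless, so nmap can only report open|filtered unless the service actually answers:

        # on the remote server: listen on UDP 1194
        nc -u -l 1194

        # from the local machine: push a datagram through and watch it
        # appear in the listener on the other side
        echo ping | nc -u remote.example.com 1194

        # probe with nmap (UDP scans require root)
        sudo nmap -sU -p 1194 remote.example.com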

    Read the article

  • Does anyone still use Iometer?

    - by Brian T Hannan
    "Iometer is an I/O subsystem measurement and characterization tool for single and clustered systems. It is used as a benchmark and troubleshooting tool and is easily configured to replicate the behaviour of many popular applications." link text Does anyone still use this tool? It seems helpful, but I'm not sure if it's for the thing I am trying to work on. I am trying create a benchmark computer performance test that can be run before and after a Windows Optimization program does its stuff (ex: PC Optimizer Pro or CCleaner). I want to be able to make a quick statement like CCleaner makes the computer run 50% faster or something along those lines. Are there any newer tools like this one?

    Read the article

  • Open screen sessions as a certain user on boot (Ubuntu Server Linux)

    - by Pez Cuckow
    I currently have a private server running Ubuntu Server 10.04, on which I test my web apps. I also host a few game servers for some of my friends (rather than letting the CPU time go to waste :-D). These game servers run under the game user account, and each one has its own screen session (so friends can ssh in and reboot the game server, etc.). For example, screen -R l4d2 runs ./start in the L4D2 folder. However, if I reboot the server (which I have to do occasionally), all these sessions close and I have to manually recreate the screen sessions and run the required games within them. Is there a way to set these screen sessions up as daemons or services, or just to launch them on server start, so they restart themselves after a reboot? I hope I have made my question easy to understand, but feel free to ask questions! Many thanks,
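
    One low-ceremony possibility, assuming cron is available: a per-user @reboot entry can recreate each session, detached, at boot. A sketch using the question's l4d2 example (the /home/game/L4D2 path is an assumption; adjust per game):

        # edit the game user's crontab with: crontab -u game -e
        # screen -dmS starts a detached session with the given name
        @reboot cd /home/game/L4D2 && screen -dmS l4d2 ./start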

    Read the article

  • Mailer Daemon greeting failed

    - by Xelluloid
    I wrote a tool that sends automated mails to a couple of addresses. This worked for a couple of weeks. Since yesterday I have been getting Mailer-Daemon responses like this:

        Hi. This is the qmail-send program at test.test2.net. I'm afraid I wasn't
        able to deliver your message to the following addresses. This is a
        permanent error; I've given up. Sorry it didn't work out.
        [email protected]:
        Connected to 123.456.789.10 but greeting failed.
        Remote host said: 554 foo.bar.com
        I'm not going to try again; this message has been in the queue too long.

    Does someone have an idea what I can do now?
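
    One first step, hedged because the bounce only shows a truncated banner: reproduce the greeting by hand and see what the remote host actually says at connect time (IP taken from the bounce above):

        # open a raw SMTP session to the remote MX
        telnet 123.456.789.10 25
        # a healthy server greets with a 220 banner, e.g. "220 mx.example.com ESMTP";
        # a 554 at this point means the host rejects the connection outright,
        # commonly because the sending IP is blacklisted or its reverse DNS
        # does not match its HELO name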

    Read the article

  • OpenJDK6 At a Glance

    - by user9158633
    OpenJDK6 Quick Links
    Project: Home Page | How to Contribute to OpenJDK | Java SE 6 Spec
    Code: Source Bundle Download | Mercurial Repositories: [.], corba, hotspot, jaxp, jaxws, jdk, langtools
    Mailing List: [email protected] | Mail Archive | How to Subscribe
    Bloggers: Joe Darcy (the founder and the first Release Manager) | Kelly O'Hair (current Release Manager)
    Blog Posts: All Joe's OpenJDK 6 Posts | Joe's FOSDEM Presentation
    Related Projects: IcedTea6 | OpenJDK 7

    Important Notice:
    • Security fixes from Oracle will continue through EOL of the OpenJDK 6 train
    • EOL of the OpenJDK 6 train will occur no sooner than July 2012 (one year after JDK 7 ships)

    OpenJDK 6 Releases
    Releases: Release Process | Release Tools/Scripts

    Build Numbers | Release Engineer | Release Notes         | Test Results | Change List
    B01 - B22     | Joe Darcy        | B22 Blog, src bundles | B22          | B22
    B23           | Kelly O'Hair     | B23 Blog, src bundles | B23          | B23
    B24 and later | Lana Steuck      | B24 Blog, src bundles | B24          | B24

    Read the article

  • UPDATE FOR BI PUBLISHER ENTERPRISE 10.1.3.4.2 NOVEMBER 2011

    - by Tim Dexter
    It's Friday, and that means it's patch release time. Why do we do this to ourselves? 'We'll release on Friday!' It might be 11:59 on Friday, but by golly we'll have released on Friday. I can remember a release of BIP years ago where, for some reason, we went for 12/31 as a release date... were we mad? I seem to remember we made it, but talk about ridiculous pressure! The latest 10g rollup is out in the wild and available from Oracle support. It is a bug-fixing rollup, but worth getting to; know that support will want you to apply it and re-test before going forward on an SR. One simple but very useful fix or enhancement:

        [Cause of the bug]
        Customer reports that despite the clock being shown, end users are
        clicking on the View button repeatedly as the initial generation is
        taking some time. If the button were to be grayed out then this would
        prevent the users requesting the report more than once. Repeated
        requests are causing a system overload and as this is their Production
        instance this is extremely important to the customer.

        [The Fix]
        Added the logic to disable the button after the user clicks on the
        "view" button and re-enable it when the report is loaded.

    I told a group of customers once that they have a headache and we have a non-steroidal anti-inflammatory drug... alright, I actually said 'aspirin'. This little gem of a fix relieves another little headache that our aspirin was causing. The patch number for all this BIP pain killing is 13399232. Enjoy!

    Read the article

  • Motherboard rejects identical hard drive, one works the other doesn't

    - by Payson Welch
    I have an interesting situation. I have a Dell XS23-SB server with four blades in it. The blades use Supermicro X7DWT motherboards and interface with the SATA drives through a backplane. I took two identical drives from a factory RAID 0 enclosure (GDrive); one works on all four servers, the other does not. I verified that both drives work by plugging them into a hard drive cradle. This behavior can be repeated with other drives: some drives work and some don't. However, when I test them, they ALL work on my PC. What could cause this?

    Read the article

  • Why does my VertexDeclaration apparently not contain Position0?

    - by Phil
    I'm trying to get my code from issuing each draw call individually down to using at least a VertexBuffer, and preferably an IndexBuffer, but now that I'm attempting to test my code, I'm getting the error:

        The current vertex declaration does not include all the elements required
        by the current vertex shader. Position0 is missing.

    This makes absolutely no sense to me, as my VertexDeclaration is:

        public readonly static VertexDeclaration VertexDeclaration = new VertexDeclaration(
            new VertexElement(0, VertexElementFormat.Vector3, VertexElementUsage.Position, 0),
            new VertexElement(sizeof(float) * 3, VertexElementFormat.Color, VertexElementUsage.Color, 0),
            new VertexElement(sizeof(float) * 3 + 4, VertexElementFormat.Vector3, VertexElementUsage.Normal, 0)
        );

    which clearly contains the position information. I am attempting to draw with the following lines:

        VertexBuffer vb = new VertexBuffer(GraphicsDevice, VertexPositionColorNormal.VertexDeclaration,
            c.VertexList.Count, BufferUsage.WriteOnly);
        IndexBuffer ib = new IndexBuffer(GraphicsDevice, typeof(int),
            c.IndexList.Count, BufferUsage.WriteOnly);
        vb.SetData<VertexPositionColorNormal>(c.VertexList.ToArray());
        ib.SetData<int>(c.IndexList.ToArray());
        GraphicsDevice.DrawIndexedPrimitives(PrimitiveType.TriangleList, 0, 0,
            vb.VertexCount, 0, c.IndexList.Count / 3);

    where c is a Chunk class containing an 8x8x8 array of boxes. Full code is available at https://github.com/mrbaggins/Box/tree/ProperMeshing/box/box. Relevant locations are Chunk.cs (contains the VertexDeclaration) and Game1.cs (Draw() is in lines 230-250). Not much else of relevance to this problem is anywhere else. Note that the large commented sections are from an old version of the drawing code.
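
    A hedged observation, since the binding code is not shown above: in XNA 4.0, DrawIndexedPrimitives reads from whatever buffers are currently attached to the device, and the vertex declaration the shader sees travels with the bound vertex buffer, so buffers that are created and filled but never attached produce exactly this "Position0 is missing" error. The attach step would look like this (names reused from the question):

        // attach the buffers to the device before drawing; the vertex
        // declaration is taken from the bound vertex buffer
        GraphicsDevice.SetVertexBuffer(vb);
        GraphicsDevice.Indices = ib;
        GraphicsDevice.DrawIndexedPrimitives(PrimitiveType.TriangleList, 0, 0,
            vb.VertexCount, 0, c.IndexList.Count / 3);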

    Read the article

  • Send mail from a distribution group's email address

    - by Campo
    A user has send permission on a distribution group on a Windows Server 2003 domain, and I am the admin. When either of us sends email using the distribution group's email address, we get a non-delivery report:

        Your message did not reach some or all of the intended recipients.
        Subject: TEST
        Sent: 4/19/2010 4:46 PM
        The following recipient(s) cannot be reached:
        [email protected] on 4/19/2010 4:46 PM
        You do not have permission to send to this recipient. For assistance,
        contact your system administrator.
        MSEXCH:MSExchangeIS:/DC=local/DC=DOMAIN:SERVERNAME

    Thanks, JC

    Read the article

  • Can't get NAT to work

    - by user31738
    Hello, I bought a D-Link DIR-300 wireless router and I can't get NAT to work; I have SSH and HTTP services I need to forward to the internet. My connection is as follows: I have an ADSL connection through an ADSL Ethernet modem, connected and working, but it doesn't let me put it in bridge mode. My router is connected to the ADSL modem via Ethernet and gets its IP through DHCP (it's always the same address). I have a desktop computer running Linux with Apache and OpenSSH configured and working, on a fixed IP. I configured NAT on the modem to forward port 22 from the router's IP out to the internet, and on the router I set up NAT to forward port 22 from the desktop's fixed IP outward. This setup already worked with a Fonera I had before. Can anyone help me with this, or tell me what kind of tests I need to do? How can I test whether the router is forwarding ports correctly, before involving the modem?
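
    A sketch of how the router hop could be tested in isolation, assuming a spare machine can be plugged into the modem alongside the router (the bracketed addresses are placeholders):

        # from a machine on the modem's segment, aim at the router's WAN address;
        # if the router forwards correctly, this reaches the desktop's sshd
        nc -vz <router-wan-ip> 22

        # then test the full chain from an outside host on the internet
        nc -vz <modem-public-ip> 22

    If the first test succeeds and the second fails, the remaining problem is the modem's forwarding rule rather than the router's.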

    Read the article

  • Approach for monitoring internet backbone traffic volume

    - by Greg Harman
    I'm interested in getting a picture of relative volume across different internet backbones. In particular, I'd like to see how traffic volume over a given route differs over the course of a day or from one day to the next. InternetTrafficReport.com is the closest approximation to this that I've found online, and their approach is to test ping times to a number of key routers from several geographically-dispersed servers. This sounds like one straightforward way to measure, but I don't have several geographically-dispersed servers. Is there a different approach for sampling this type of information from a single server?
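
    One single-server approach, hedged because it measures latency (a proxy for congestion) rather than volume itself: sample round-trip times to a fixed set of backbone routers on a schedule and compare the series by hour or day. A minimal sketch (router hostnames and the log path are invented):

        #!/bin/sh
        # latency-sample.sh - append one timestamped average RTT per target;
        # run from cron, e.g. */10 * * * * /usr/local/bin/latency-sample.sh
        TARGETS="core1.example-isp.net core2.example-isp.net"
        for t in $TARGETS; do
            # the 5th '/'-separated field of ping's summary line is the average RTT
            rtt=$(ping -c 5 -q "$t" | awk -F'/' '/^rtt|^round-trip/ {print $5}')
            echo "$(date +%s),$t,$rtt" >> /var/log/latency.csv
        done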

    Read the article

  • How to run multiple distros using LVM

    - by Mark
    I've seen quite a few posts about running multiple distros, but I'm not sure they apply to using LVM (and without Windows). I'm using a machine that's about 3 years old. Setup: Intel Core i7 2.8GHz, 8GB RAM, 1TB SATA HDD. At this point, I'd like to install 12.10 and Mint 14, leaving the option open to install additional distros down the road. I could be way off, but I'm thinking of creating at least two primary /boot partitions (one for 12.10 and one for Mint) and another partition for LVM, leaving room for additional /boot partitions. Then I'd create a VG and separate LVs for Ubuntu 12.10 and Linux Mint 14. I understand I can share partitions between the two installs, but I'm only using this for testing and I have tons of space to play with. LVM seemed logical considering I may want to install and test additional distros. I guess I could share the swap partition across the board without problems, right? I'm unclear about GRUB2. How do I handle the bootloader situation? Install 12.10 and get it running, then make changes to grub.cfg after installing Mint? And do I not install GRUB for Mint, or do I install it in a different location? Any guidance would be appreciated.
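
    For the LVM side of this plan, the carving step is short; a sketch assuming the LVM partition ends up at /dev/sda3 (device name, VG name, and sizes are all placeholders):

        pvcreate /dev/sda3                       # mark the partition for LVM
        vgcreate distros /dev/sda3               # one volume group for everything
        lvcreate -L 30G -n ubuntu1210 distros    # root LV for Ubuntu 12.10
        lvcreate -L 30G -n mint14 distros        # root LV for Mint 14
        lvcreate -L 8G  -n swap distros          # a single swap LV shared by all

    Each installer is then pointed at its own LV for /, and sharing one swap LV between installs is generally fine (hibernation being the usual caveat, since each distro would overwrite the other's resume image).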

    Read the article

  • Read/Write/Verify disk diagnostic tool for Mac OS X?

    - by Spiff
    It seems that there are many tools out there for Mac OS X that test a hard drive for bad blocks by doing a Read/Verify pass. That is, they read a block, then read it a second time, and verify that both reads yielded the same results. I need a tool that does a non-destructive Read/Write/Verify pass. It should read each block, write those same contents back out, and then read it again to verify. That way every block gets written, giving the hard drive a chance to spare out bad blocks. But since the same contents that were just read get written back out, it doesn't destroy data that wasn't already lost. I'm aware of several tools that can do Read/Verify, but I'm not aware of any that do Read/Write/Verify. Are there any tools that do what I want? Unix / open source tools that compile and run on Mac OS X are fair game too.
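
    One candidate, offered tentatively since current Mac OS X builds should be verified first: badblocks from e2fsprogs (buildable or installable via package managers such as MacPorts or Homebrew) has a non-destructive read-write mode, -n, which reads each block, writes test patterns, verifies them, and then restores the original contents:

        # non-destructive read-write pass over a whole disk
        # (device name illustrative; unmount the disk first)
        badblocks -nsv /dev/disk2    # -s progress, -v verbose

    That is slightly stronger than a plain write-back of the same data, since the test patterns exercise each block before the original contents are restored.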

    Read the article

  • ArchBeat Link-o-Rama for 2012-04-12

    - by Bob Rhubart
    2012 Real World Performance Tour Dates | Performance Tuning | Performance Engineering (www.ioug.org)
    Coming to your town: a full day of real world database performance with Tom Kyte, Andrew Holdsworth, and Graham Wood. Rochester, NY - March 8; Los Angeles, CA - April 30; Orange County, CA - May 1; Redwood Shores, CA - May 3.

    Oracle Technology Network Developer Day: MySQL - New York (www.oracle.com)
    Wednesday, May 02, 2012, 8:00 AM – 4:30 PM, Grand Hyatt New York, 109 East 42nd Street, Grand Central Terminal, New York, NY 10017

    Webcast Series: Data Warehousing Best Practices (event.on24.com)
    April 19, 2012 - Best Practices for Workload Management of a Data Warehouse on Oracle Exadata. May 10, 2012 - Best Practices for Extreme Data Warehouse Performance on Oracle Exadata.

    Webcast: Untangle Your Business with Oracle Unified SOA and Data Integration (event.on24.com)
    Date: Tuesday, April 24, 2012. Time: 10:00 AM PT / 1:00 PM ET. Speakers: Mala Narasimharajan - Senior Product Marketing Manager, Oracle Data Integration, Oracle; Bruce Tierney - Director of Product Marketing, Oracle SOA Suite, Oracle.

    The Increasing Focus on Architecture (ArchBeat) (blogs.oracle.com)
    As a "third wave" of computing, Cloud computing is changing how IT organizations and individuals within those organizations approach the creation of solutions.

    Updated SOA Documents now available in ITSO Reference Library (blogs.oracle.com)
    Nine updated documents have just been added to the IT Strategies from Oracle library, including SOA Practitioner Guides, SOA Reference Architectures, and SOA White Papers and Data Sheets. Access to all documents within the ITSO library is free to those with a free Oracle.com membership.

    WebLogic JMS Clustering and Spring | Rene van Wijk (middlewaremagic.com)
    Oracle ACE Rene van Wijk sets up a WebLogic cluster that includes a JMS environment, which will be used by Spring.

    Running Built-In Test Simulator with SOA Suite Healthcare 11g in PS4 and PS5 | Shub Lahiri (blogs.oracle.com)
    Shub Lahiri shows how the pre-installed simulator that comes with the SOA Suite for Healthcare Integration pack can be used as an external endpoint to generate inbound and outbound HL7 traffic on specified MLLP ports.

    In the cloud era, let's start calling IT what it is: 'Innovation Team' | Joe McKendrick (www.zdnet.com)
    Cloud, the third great shift in 50 years of computing, presents a golden opportunity for IT to get out in front and lead.

    Thought for the Day
    "Why do we never have time to do it right, but always have time to do it over?" — Anonymous

    Read the article

  • CodePlex Daily Summary for Tuesday, November 12, 2013

    CodePlex Daily Summary for Tuesday, November 12, 2013

    Popular Releases:
    • Export Version History Of SharePoint 2010 List Items to Microsoft Excel: ExportVersionHistory_R9 — Fix for "&" in List Title.
    • SharePoint Client Browser for SharePoint 2010 and 2013: SharePoint 2013 Client Browser v1.1, released 11/12/2013 — Added splash screen on startup for a better user experience; added support for Taxonomy (Term Store, Group, Term Set and Terms); added limited support for User Profiles, only showing the current user profile; fixed columns issue with Raw Data tab (clearing columns).
    • sb0t v.5: sb0t 5.16 r2 — Fixed PM blocking bug. Allow room scribbles by supported clients. Link user list bug hopefully fixed. Ignore scribbles for ignored users. Fixed additional PM bug.
    • NuGet: NuGet 2.7.2 — 2.7.2 release.
    • Domain Oriented N-Layered .NET 4.5: AutoNLayered v1.1.1 — MVC 5; Entity Framework 6 (asynchronous architecture - TAP); improvements in services, repository, UoW...
    • Paint.NET PSD Plugin: 2.4.0 — PSDPlugin version 2.4.0 requires Paint.NET 3.5.11. Changes: multichannel images can now be loaded, with each channel showing up as a separate grayscale layer; more reliable loading of files with layer groups created in Photoshop CS5 and CS6; faster loading of PSD files (speed improvement of 1.1x to 1.5x on 8-bit images, and about 15% on 32-bit RGB images); more efficient RLE compression, with file sizes reduced by about 4% on average; the PsdFile class can now load and save most PSD...
    • Lib.Web.Mvc & Yet another developer blog: Lib.Web.Mvc 6.3.3 — Lib.Web.Mvc is a library which contains helper classes for ASP.NET MVC, such as a strongly typed jqGrid helper, XSL transformation HtmlHelper/ActionResult, FileResult with range request support, custom attributes and more. The release contains Lib.Web.Mvc.dll with its XML documentation file, standalone documentation in a chm file plus change log, and the library source code. A sample application for the strongly typed jqGrid helper is available here; a sample application for the XSL transformation HtmlHelper/ActionRe...
    • Magick.NET: Magick.NET 6.8.7.501 — Magick.NET linked with ImageMagick 6.8.7.5. Breaking changes: refactored MagickImageStatistics to prepare for upcoming changes in ImageMagick 7; renamed MagickImage.SetOption to SetDefine.
    • Media Companion: Media Companion MC3.587b — Fixed: TV - locked shows display correctly after refresh; TV - missing episodes display in the correct colour for missed or to-be-aired; TV - rescrape of multi-episodes working; TV - cache fix where episodes were being written multiple times; TV - fixed cache writing missing episodes when "Display missing eps" was disabled. Revision history.
    • Generate report of user mailbox size for Exchange 2010: Script Download — http://gallery.technet.microsoft.com/scriptcenter/Generate-report-of-user-e4e9afca
    • Check SQL Server a specified database index fragmentation percentage (SQL): Script Download — http://gallery.technet.microsoft.com/scriptcenter/Check-SQL-Server-a-a5758043
    • Save attachments from multiple selected items in Outlook (VBA): Script Download — http://gallery.technet.microsoft.com/scriptcenter/Save-attachments-from-5b6bf54b
    • Remove Windows Store apps in Windows 8: Script Download — http://gallery.technet.microsoft.com/scriptcenter/Remove-Windows-Store-Apps-a00ef4a4
    • PCSX-Reloaded: 1.9.94 — General changes: support for compressed audio in cue files; ECM support. OS X changes: 32-bit support has been dropped; partial French and Hungarian translations.
    • VidCoder: 1.5.12 Beta — Added an option to preserve Created and Last Modified times when converting files (in Options -> Advanced). Added an option to mark an automatically selected subtitle track as "Default". Updated HandBrake core to SVN 5878. Fixed auto passthrough not applying just after switching to it. Fixed a bug where preset/profile/tune could disappear when reverting a preset.
    • Toolbox for Dynamics CRM 2011/2013: XrmToolBox (v1.2013.9.25) — XrmToolBox improvement: correct changing connection from the status dropdown. Updated tools: Audit Center (v1.2013.9.10) -> publish entities; Iconator (v1.2013.9.27) -> optimized asynchronous loading of images and entities; MetadataDocumentGenerator (v1.2013.11.6) -> correct system entities reading with incorrect attribute type; Script Manager (v1.2013.9.27) -> retrieve only custom events; SiteMapEditor (v1.2013.11.7) -> reset of CRM 2013 SiteMap; ViewLayoutReplicator (v1.201...
    • Microsoft SQL Server Product Samples: Database: SQL Server 2014 CTP2 In-Memory OLTP Sample, based — This sample showcases the new In-Memory OLTP feature, which is part of SQL Server 2014 CTP2. It shows the new memory-optimized tables and natively-compiled stored procedures, and can be used to show the performance benefit of In-Memory OLTP. Installation instructions for the sample are included in the file 'awinmemsample.doc', which is part of the download. You can ask a question about this sample at the SQL Server Samples Forum.
    • Composite C1 CMS - Open Source on .NET: Composite C1 4.1 — Composite C1 4.1 (4.1.5058.34326). Write a review for this release - help us improve, recommend us. Getting started: if you are new to Composite C1 and want to install it, see http://docs.composite.net/Getting-started. Highlights of major changes since Composite C1 4.0, general user features: drag-and-drop images and files like PDF and Word documents directly from your own desktop and folders into page content; allow you to install Composite For...
    • CS-Script for Notepad++ (C# intellisense and code execution): Release v1.0.9.0 — Implemented Recent Scripts list. Added checking for plugin updates from AboutBox. Multiple formatting improvements/fixes. Implemented selection of the CLR version when preparing the distribution package. Added project panel button for showing the plugin shortcuts list. Added 'What's New?' panel. Fixed auto-formatting scrolling artifact. Implemented navigation to the "logical" file (vs. the auto-generated one) from the output panel. To avoid the DLLs getting locked by the OS, use the MSI file for installation.
    • WPF Extended DataGrid: WPF Extended DataGrid 2.0.0.10 binaries — Row summaries are now updated whenever an autofilter value is modified.

    New Projects:
    • asmhighlighter fork for Visual Studio 2013 — Adds more customizable colors in Visual Studio 2013 -> Options -> Font and colors.
    • Centrafuse Plugin for Mapfactor Navigator — This Centrafuse plugin embeds Mapfactor Navigator as a window inside Centrafuse and integrates it as the navigation engine.
    • Coded UI Hybrid Test Automation Framework — A blueprint of a hybrid Coded UI test automation framework which aims to eliminate hardships and increase its adoption across organizations.
    • Excel add-in for Generalized Jarrow-Rudd option pricing — Excel add-in to call fmsgjr code.
    • FindFileTool — The released program file for the Lotte search tool.
    • FJYC_PLAN(new) — PLAN
    • Generalized Jarrow-Rudd Option Pricing — Perturb cumulants of normal distributions instead of lognormal.
    • Hawk LAN Messenger — My first LAN messenger project.
    • Hydration Kit for App-V 5 Sequencing — This kit builds automated installers for Microsoft Application Virtualization (App-V) 5.0 sequencing environments.
    • MDCC Taxi — Overview of the scheduled and ordered taxis that drive employees of Microsoft Development Centre Copenhagen to the local train in Skodsborg.
    • MeetSomeNearbyStranger — This project represents a web service for an Android application of the same name.
    • MsgBox — This project contains a message box service locator implementation written in C# and WPF.
    • SuperSocket FTP Server — A pure C# FTP server based on SuperSocket.
    • xRM Import-Export Solution Manager for Microsoft Dynamics CRM — This tool provides the ability to automate the import/export of solutions in Microsoft Dynamics CRM 2011 and 2013, also generating a PowerShell outcome.

    Read the article

  • wamp - Changing PHP version stops server running

    - by James Connor
    I downloaded WAMP, which works (green icon). However, I need to test a site locally in Joomla 1.5, which produced errors when using PHP 5.3. I believe I need a PHP version lower than that, i.e. 5.2.x. To do this I went through PHP -> Version -> Get More... and installed an older PHP version. However, when I start that PHP version the icon stays orange, and going to localhost doesn't work. I haven't used WAMP before, so my knowledge of it is limited. If anyone could point me in the right direction it would be greatly appreciated. Regards.

    Read the article

  • Configuring DNS zone file and named.conf

    - by tike
    Hi, I was trying to configure DNS so that I could serve two domain names. I managed the first one, as it's pretty simple and easy, e.g. test.com. Now I want to serve test1.com on the same machine. How do I configure the zone file and named.conf? I have no concept of this at all. Any clue appreciated. Thanks, Tikejhya.
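
    Assuming BIND (the usual owner of named.conf), a second domain is just another zone block plus its own zone file; a minimal sketch with placeholder paths and IPs:

        // named.conf - add one zone block per domain
        zone "test1.com" {
            type master;
            file "/etc/bind/db.test1.com";
        };

        ; /etc/bind/db.test1.com - minimal zone file for the new domain
        $TTL 86400
        @    IN  SOA  ns1.test1.com. admin.test1.com. (
                 2013110101 ; serial
                 3600       ; refresh
                 900        ; retry
                 604800     ; expire
                 86400 )    ; negative-caching TTL
             IN  NS   ns1.test1.com.
             IN  A    192.0.2.10
        ns1  IN  A    192.0.2.10

    Reload with rndc reload (or restart named) and check the result with dig @localhost test1.com.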

    Read the article

  • Best practices to avoid Jenkins error: sudo: no tty present and no askpass program specified

    - by s g
    When running any sudo command from Jenkins I get the following error: sudo: no tty present and no askpass program specified I understand that I can solve this by adding a NOPASSWD entry to my /etc/sudoers file which will allow user jenkins to run commands without needing a password. I can add an entry like this: %jenkins ALL=(ALL)NOPASSWD:/home/vts_share/test/sudotest.sh ...but this leads to the following issue: how to avoid specifying full path in sudoers file? I can add an entry like this: %jenkins ALL=NOPASSWD: ALL ...but this allows user jenkins to avoid the password prompt for all commands, which seems a bit unsafe. I'm just curious what my options are here, and if there are any best practices I should consider.
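
    A middle ground between the two sudoers entries above is a Cmnd_Alias that whitelists only the commands the jobs actually need, so the password prompt is skipped for those and nothing else. A sketch (the script path is the question's own example; the service command is a hypothetical addition):

        # /etc/sudoers.d/jenkins -- edit with: visudo -f /etc/sudoers.d/jenkins
        Cmnd_Alias JENKINS_CMDS = /home/vts_share/test/sudotest.sh, \
                                  /usr/sbin/service myapp restart
        %jenkins ALL=(ALL) NOPASSWD: JENKINS_CMDS

    Keeping the list short, root-owned, and non-writable by jenkins avoids the privilege-escalation hole that a writable NOPASSWD target would open.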

    Read the article

  • Squid - Selective reverse proxy and forward proxy

    - by Dean Smith
    I'd like to set up a Squid instance that reverse-proxies a configured list of URLs while acting as a normal forward proxy for everything else. We are building new infrastructure, parallel-live as it were, and I want a proxy people can use that forces selected traffic onto the new platform while simply forwarding everything else. That makes it very easy for people and systems to test the portions of the new platform we want, without having to change much: they just use a proxy address. Is such a setup possible?
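
    It should be possible: Squid can route requests for matching domains through a cache_peer marked originserver while letting everything else go direct. A hedged squid.conf sketch (domain, peer address, and access rules are illustrative, not a tested config):

        http_port 3128                             # clients use this as a normal proxy

        acl newplat dstdomain .newplatform.example.com

        # requests matching the acl are handed to the new platform's front end
        cache_peer 10.0.0.10 parent 80 0 no-query originserver name=newfe
        cache_peer_access newfe allow newplat
        cache_peer_access newfe deny all

        # force matching traffic through the peer; all other traffic goes direct
        never_direct allow newplat
        always_direct allow !newplat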

    Read the article

  • How to extract jpegs from a video file using ffmpeg

    - by Andrew Simpson
    I am using C# and ffmpeg. In this scenario I have 279 individual JPEGs, and I have used ffmpeg to create an AVI file from these images on my client. Command line:

        -f image2 -r 10 -i "C:\000EC902F17F\img%05d.jpg" -s 352x288 -y "C:\1\test.avi"

    I then upload the AVI to my server and extract JPEGs from it. Command line:

        -i c:\1\1.avi c:\1\img-%05d.jpg

    I get 265 JPEGs back, so ffmpeg is evidently dropping frames, most probably when the AVI is first created. Is there a way to force it to encode using ALL the images I have? Thanks. PS: I did not specify any command-line options other than the size of the video output. As far as I am aware, if none are specified then ffmpeg automatically chooses the best ones?
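
    A hedged guess at the mechanism: the AVI was muxed at -r 10, and on extraction ffmpeg converts frames to its idea of the output rate, dropping or duplicating them to keep timestamps consistent. Two variants that should emit one JPEG per stored frame (paths reused from the question):

        :: match the rate the AVI was encoded at
        ffmpeg -i c:\1\1.avi -r 10 c:\1\img-%05d.jpg

        :: or disable frame-rate conversion entirely (timestamp passthrough)
        ffmpeg -i c:\1\1.avi -vsync 0 c:\1\img-%05d.jpg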

    Read the article

  • Running a cron job on Ubuntu Server for SugarCRM

    - by Logik
    I am pretty inexperienced with Linux, so please be descriptive in your answer. My environment: a local Linux server (Ubuntu 12.04) hosting SugarCRM 6.5.2. There is an area in SugarCRM called the Scheduler, where I can configure some predefined jobs; in my case I am trying to run email reminders (every minute/hour/day/month). For this scheduler to be effective, I read somewhere that I need to set up a cron job. So I did some research and finally put the following line in the crontab for the root user, as per the instructions given in SugarCRM:

        cd /var/www/crm; php -f cron.php > /dev/null 2>&1

    Well, I am creating contracts in my SugarCRM (AOS module) and I want email reminders for these contracts to be sent to the person concerned. My SugarCRM email is configured correctly and I can send test emails with it, but cron plus the scheduler gives no result: no emails arrive. I then read /var/log/syslog, and it shows an entry for the following line each minute:

        Oct 27 15:03:01 unicomm CRON[28182]: (root) CMD (cd /var/www/crm; php -f cron.php > /dev/null 2>&1)

    I have a few questions: 1) What does the cron line I've added to the crontab mean? The "cd /var/www/crm; php -f cron.php > /dev/null 2>&1" part makes no sense to me. 2) How am I supposed to get this thing to work? I've searched a lot (including the SugarCRM forum), but no luck.

    Read the article

  • Faster, Simpler access to Azure Tables with Enzo Azure API

    - by Herve Roggero
    After developing the latest version of Enzo Cloud Backup I took the time to create an API that simplifies access to Azure Tables (the Enzo Azure API). At first my goal was to make the code simpler than the Microsoft Azure SDK. But as it turns out it is also a little faster; and when using the specialized methods (the fetch strategies) it is much faster out of the box than the Microsoft SDK, unless you start writing complex parallel and resilient routines yourself. Last but not least, I decided to add a few extension methods that I think you will find attractive, such as the ability to transform a list of entities into a DataTable. So let's review each area in more detail.

    Simpler Code

    My first objective was to make the API much easier to use than the Azure SDK. I wanted to reduce the amount of code necessary to fetch entities, remove the code needed for automatic retries and transient-condition handling, and give additional control, such as a way to cancel operations, obtain basic statistics on the calls, and cap the number of REST calls the API generates in an attempt to avoid throttling conditions in the first place (something you cannot do with the Azure SDK at this time).

    Strongly Typed

    Before diving into the code: the following examples rely on a strongly typed class called MyData. MyData is defined similarly for the Azure SDK and the Enzo Azure API, except that the classes inherit from different bases. With the Azure SDK, classes that represent entities must inherit from TableServiceEntity, while classes used with the Enzo Azure API must inherit from BaseAzureTable or implement a specific interface.

        // With the SDK
        public class MyData1 : TableServiceEntity
        {
            public string Message { get; set; }
            public string Level { get; set; }
            public string Severity { get; set; }
        }

        // With the Enzo Azure API
        public class MyData2 : BaseAzureTable
        {
            public string Message { get; set; }
            public string Level { get; set; }
            public string Severity { get; set; }
        }

    Now that the classes representing an Azure Table entity are defined, here is what the Azure SDK code looks like when fetching all the entities from an Azure Table (note the use of a few variables: the _tableName variable stores the name of the Azure Table, and the ConnectionString property returns the connection string for the Storage Account containing the table):

        // With the Azure SDK
        public List<MyData1> FetchAllEntities()
        {
            CloudStorageAccount storageAccount = CloudStorageAccount.Parse(ConnectionString);
            CloudTableClient tableClient = storageAccount.CreateCloudTableClient();
            TableServiceContext serviceContext = tableClient.GetDataServiceContext();
            CloudTableQuery<MyData1> partitionQuery =
                (from e in serviceContext.CreateQuery<MyData1>(_tableName)
                 select new MyData1()
                 {
                     PartitionKey = e.PartitionKey,
                     RowKey = e.RowKey,
                     Timestamp = e.Timestamp,
                     Message = e.Message,
                     Level = e.Level,
                     Severity = e.Severity
                 }).AsTableServiceQuery<MyData1>();
            return partitionQuery.ToList();
        }

    This code gives you automatic retries, because AsTableServiceQuery does that for you, and it is strongly typed because it uses LINQ. Although it doesn't look like too much code at first glance, you are actually mapping the strongly typed object manually, so for larger entities with dozens of properties your code will grow. And from a maintenance standpoint, when a new property is added you may need to change the mapping code. Note that the mapping being performed is optional; it is desirable when you want to retrieve only specific properties of the entities (not all of them) to reduce network traffic. If you do not specify the properties you want, all properties are returned; in this example we return the Message, Level and Severity properties (in addition to the required PartitionKey, RowKey and Timestamp).

    The Enzo Azure API does the mapping automatically and also handles automatic retries when fetching entities. The equivalent code to fetch all the entities (with the same three properties) from the same Azure Table looks like this:

        // With the Enzo Azure API
        public List<MyData2> FetchAllEntities()
        {
            AzureTable at = new AzureTable(_accountName, _accountKey, _ssl, _tableName);
            List<MyData2> res = at.Fetch<MyData2>("", "Message,Level,Severity");
            return res;
        }

    As you can see, the Enzo Azure API returns the entities already strongly typed, so there is no need to map the output. It also makes it easy to specify the list of properties to return and to supply a filter (no filter was provided in this example; the filter is passed as the first parameter).

    Fetch Strategies

    Both approaches discussed above fetch the data sequentially. In addition to the linear/sequential fetch methods, the Enzo Azure API provides specific fetch strategies. Fetch strategies prepare a set of REST calls, executed in parallel, in a way that performs faster than fetching the data sequentially. For example, if the PartitionKey is a GUID string, you could prepare multiple calls with appropriate filters (['a', 'b'[, ['b', 'c'[, ['c', 'd'[, ...) and send those calls in parallel. As you can imagine, the code necessary to create these requests would be fairly large. With the Enzo Azure API, two strategies are provided out of the box: the GUID and List strategies. If you are interested in how these strategies work, see the Enzo Azure API Online Help. Here is example code that performs parallel requests using the GUID strategy (which executes two to three times faster than the sequential methods discussed previously):

        public List<MyData2> FetchAllEntitiesGUID()
        {
            AzureTable at = new AzureTable(_accountName, _accountKey, _ssl, _tableName);
            List<MyData2> res = at.FetchWithGuid<MyData2>("", "Message,Level,Severity");
            return res;
        }

    Faster Results

    With sequential fetch methods: Developing a faster API wasn't a primary objective, but the performance tests performed with the Enzo Azure API deliver the data a little faster out of the box (5%-10% on average, and sometimes up to 50% faster) with the sequential fetch methods. Although the amount of data is the same regardless of the approach (and the REST calls are almost exactly identical), the object mapping approach is different, so the slight performance increase is likely due to a lighter API. Using LINQ offers many advantages and tremendous flexibility; nevertheless, when fetching data the Enzo Azure API seems to deliver faster. For example, the code previously discussed delivered the following results when fetching 3,000 entities (about 1KB each): the Azure SDK returned the 3,000 entities in about 5.9 seconds on average, while the Enzo Azure API took 4.2 seconds on average (a 39% improvement).

    With fetch strategies: When using the fetch strategies we are no longer comparing apples to apples; the Azure SDK is not designed to implement fetch strategies out of the box, so you would need to code the strategies yourself. Nevertheless, I wanted to provide out-of-the-box capabilities, and as a result one test returned about 10,000 entities (1KB each), averaged over 5 runs. The Azure SDK implemented a sequential fetch while the Enzo Azure API used the List fetch strategy; the fetch strategy was 2.3 times faster. Note that this test quickly hit a limit on my network bandwidth (3.56 Mbps), so the fetch strategy's results are significantly below what they could be with higher bandwidth.

    Additional Methods

    The API wouldn't be complete without a few important methods beyond the fetch methods discussed previously. The Enzo Azure API also offers: support for batch updates, deletes and inserts; conversion of entities to DataRow, and List<> to a DataTable; extension methods for Delete, Merge, Update, Insert; support for asynchronous calls and cancellation; and support for fetch statistics (total bytes, total REST calls, retries…). For more information, visit http://www.bluesyntax.net or go directly to the Enzo Azure API page (http://www.bluesyntax.net/EnzoAzureAPI.aspx).

    About Herve Roggero

    Herve Roggero, Windows Azure MVP, is the founder of Blue Syntax Consulting, a company specialized in cloud computing products and services. Herve's experience includes software development, architecture, database administration and senior management with both global corporations and startup companies. Herve holds multiple certifications, including an MCDBA, MCSE, and MCSD. He also holds a Master's degree in Business Administration from Indiana University. Herve is the co-author of "PRO SQL Azure" from Apress and runs the Azure Florida Association (on LinkedIn: http://www.linkedin.com/groups?gid=4177626). For more information on Blue Syntax Consulting, visit www.bluesyntax.net.

    Read the article

  • Can't get PHP script to run when launched from Aptana Studio

    - by samuel
    I've been using Notepad++ for web development (currently HTML, CSS, and some PHP and MySQL). I decided to try an IDE to see if I could cut down on development time and have more power than Notepad++. I grabbed Aptana yesterday and, after wrestling with it for a few hours, finally got everything up and running. The only problem is that my web pages, which are .php files, do not execute any of their embedded PHP code in the browser. As an example:

        <?php echo "IT WORKS WOOO"; ?>

    ought to print IT WORKS WOOO smack in the middle of my blank test page, and yet it does not. I checked whether I had somehow forgotten how to write an echo statement and ran it on my desktop, using Notepad++, and it worked just fine. I have the PHP plugin, have used the PHP perspective to write the code, and have launched it as a PHP web page, but nothing makes the PHP within execute. Any ideas?

    Read the article

  • YSlow says certain CSS files are not gzipped

    - by rhand
    YSlow keeps telling me that files like http://www.example.com/wp-content/plugins/q-and-a/css/q-a-plus.css?ver=1.0.6.2 are not gzipped, while the gzip test tool at Feed the Bot says I am all good:

        Compressed?              Yes
        Compression type         gzip
        Page size (Bytes)        32,493
        Compressed size (Bytes)  -1
        Saving (Bytes)           32,494
        Compression %            100%

    I added this to my .htaccess:

        # Gzip
        <ifModule mod_gzip.c>
        mod_gzip_on Yes
        mod_gzip_dechunk Yes
        mod_gzip_item_include file \.(html?|txt|css|js|php|pl)$
        mod_gzip_item_include handler ^cgi-script$
        mod_gzip_item_include mime ^text/.*
        mod_gzip_item_include mime ^application/x-javascript.*
        mod_gzip_item_exclude mime ^image/.*
        mod_gzip_item_exclude rspheader ^Content-Encoding:.*gzip.*
        </ifModule>

        # Deflate
        <ifmodule mod_deflate.c>
        AddOutputFilterByType DEFLATE text/text text/html text/plain text/xml text/css application/x-javascript application/javascript
        </ifmodule>

    The response headers for the file mentioned are:

        CF-Cache-Status    MISS
        CF-RAY             13945df90a9a0c1d-AMS
        Cache-Control      public, max-age=2592000
        Connection         keep-alive
        Content-Encoding   gzip
        Content-Type       application/javascript
        Date               Thu, 12 Jun 2014 07:34:38 GMT
        Expires            Sat, 12 Jul 2014 07:34:38 GMT
        Last-Modified      Thu, 21 Feb 2013 01:29:18 GMT
        Server             cloudflare-nginx
        Transfer-Encoding  chunked
        Vary               Accept-Encoding

    Any ideas what I am missing here?
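
    One way to see exactly what a client sees is to request the file with and without an Accept-Encoding header and compare the responses; a quick sketch using the URL from the question:

        URL="http://www.example.com/wp-content/plugins/q-and-a/css/q-a-plus.css?ver=1.0.6.2"

        # does the server announce gzip when the client offers it?
        curl -s -I -H "Accept-Encoding: gzip" "$URL" | grep -i content-encoding

        # bytes on the wire, compressed vs. uncompressed
        curl -s -o /dev/null -w "%{size_download}\n" --compressed "$URL"
        curl -s -o /dev/null -w "%{size_download}\n" "$URL"

    If the compressed request is served gzipped but YSlow still complains, the discrepancy may come from the request headers the tool sends (or, with CloudFlare in front, from which cache node answered).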

    Read the article
