A Good Man Under Fire – A Completely Different Story of Interior Ministry Thuggery in Egypt
Who am I, or rather who are we? That's a question that might cross the mind of anyone reading this post, expecting to hear about some kid "Helambo" who got picked up at the El-Amreya police station and tortured by the officer. But let me tell you: this time you'll hear something entirely different, something new, from people you may or may not know, but definitely not who you expect. That's why I wanted to start by telling you who we are.
My name is Mahmoud Magdy; my job title translates roughly to Senior Systems Architect. The person I'm going to tell you about is Mohamed Fawzi, who works as a Senior Systems Consultant at LINKdotNET. But you still don't know who we are.
Mohamed and I belong to a group of roughly 12 other Egyptians who hold a certification, or more precisely an award, considered the most prestigious award Microsoft grants to technology professionals outside the company worldwide. In Arabic it would be called "the Most Valuable Professional":
Microsoft Most Valuable Professional
This award is not given to just anyone. It is granted after observing the person, their technical level, and their contributions to the community, in recognition of expertise they have not used solely in their own work but have volunteered, free of charge, to serve the technical community around them and spread technology in their country, in many fields and in many ways.
How many people hold this award in Egypt? Roughly 12, and Mohamed and I are the only ones in our specialties who earned it: mine in
Exchange Server
and Mohamed's in
Virtualization and Data Center
The others are no less valuable or accomplished. We are not geniuses or superhumans, but we reached a high professional level and did not stop there; we gave everything we could to spread knowledge in our country and in the world. The award description from Microsoft, as quoted from the site: https://mvp.support.microsoft.com/gp/mvpintro
The Microsoft MVP Award Program recognizes and thanks outstanding members of technical communities for their community participation and willingness to help others. The MVP Award is given to exceptional technical community leaders who foster the free and objective exchange of knowledge by actively sharing their real-world expertise with technology users. The MVP Award celebrates the most active community members from around the world who provide invaluable online and offline expertise that enriches the community experience and makes a difference in technical communities that feature Microsoft products.
MVPs are a select group of experts representing technology’s best and brightest people who share a commitment to community. While MVPs come from many backgrounds and a wide range of technical communities, they share a passion for technology and a demonstrated willingness to help others. MVPs do this by writing books and articles, managing Web sites, maintaining blogs, participating in user groups, hosting and contributing chats, presenting at events and training sessions, and answering questions in technical newsgroups, forums, or message boards.
Microsoft MVPs are an amazing group of individuals. By sharing their knowledge and experiences and providing objective feedback, MVPs help people solve problems and discover new capabilities. It gives us great pleasure to recognize and award MVPs as our way of saying thank you for their demonstrated commitment to helping others in technical communities worldwide.
God knows we sought nothing but hard work and raising the bar. We are the elite, but we did not just say it, we did it, for years, in silence. We asked for no money and no status, and most people may not know us; we often contribute under unknown or assumed names, and even when we use our real names, you probably wouldn't recognize us if you saw us on the street. But we are, with all pride, the elite.
I say this with pride, not arrogance. I am proud to raise my country's name and contribute to developing my community. I have done this for years and years, and I never took and never asked for money; I serve my country and my countrymen. So does Mohamed, and so does the rest of the group, who are no less in standing or knowledge.
Watch Mohamed and me speaking a year ago at Microsoft's biggest conference in Egypt:
Transition from Exchange 2003/2007 to Exchange 2010 – Arabic Session – Microsoft OpenDoors – Egypt 2010 – Part1 from mahmoud magdy on Vimeo.
Why this introduction? Because when Mohamed is arrested over a mere name similarity, with a fabricated drug-trafficking charge and no guilt whatsoever, it is an unforgivable crime against any human being, and unforgivable many times over because no one bothered to check who the accused was or his standing in his community.
Mohamed was arrested the day before yesterday. He wrote on Twitter:
A force from the El-Sayeda investigations unit came to arrest me without stating the charge, saying only that "we want you for a quick word." The officer was respectful and polite. (cont.)
I called Malek; I had no details, and I suspected it was related to the 2006 judges' sit-in case, since that's the only case against me and I haven't been active recently. (cont.)
On the way they took my mobile, and I waited half an hour until the investigations officer told me I was wanted on a drug conviction and had escaped from prison during the revolution. (cont.)
I discovered that the fugitive's full four-part name is different from mine, and they know it, but they grabbed me so they could say they caught a fugitive, and now it's on me to prove I'm not him. (cont.)
The fugitive was jailed and escaped without any data recorded about him except his name: no ID number, no date of birth, no mother's name. Pure absurdity, and it makes it easy for them to grab anyone with a similar name.
Amr El-Imam arrived, dealt with the officer, and we reached a strange solution: we photocopy pages of my passport to prove I was traveling, and the investigators take my mobile number. (cont.)
Bottom line: anyone who doesn't travel much and has no connections is at risk of deliberate wrongful imprisonment, because the Interior Ministry wants to inflate its numbers and look like it's working.
You can read the whole story on Mohamed Fawzi's Twitter account: https://twitter.com/vfawzi
Mohamed was arrested, and if the man hadn't been traveling and didn't know a few people, God only knows what would have happened to him. Here is what I want to say:
- How many minds, souls, spirits, and people have been wronged in this country because the law is absent and disrespected?
- I have never had a political opinion; I never joined a protest or anything like that. But seeing one of Egypt's best minds at risk of being thrown away, lost in the bottom of a pit, just because some people are too lazy to do their jobs, breaks my heart. I cannot understand it, and I cannot accept it.
- Regardless of Mohamed's scientific and professional standing, this is an act that no mind, no religion, and no logic can condone.
To every official and every officer who does not fear God: one day you will be held accountable for what you have done. Fear for this country and for yourselves; you are not protecting it, you are tearing it down.
God suffices me, and He is the best disposer of affairs. God suffices me, and He is the best disposer of affairs.
What does it mean to have your backup data globally de-duped using NetBackup appliances?!
Of course, de-dupe is a great thing. The first time I realized what de-dupe was, three years ago, I was working for a NetApp partner and found out how they do de-dupe on their SAN storage; I loved the ability to eliminate redundant data from the SAN.
But what does it mean to de-dupe your backups "globally"? I will tell you later why I put "globally" in quotes.
To be honest, I didn't care much about backup de-dupe. I knew de-dupe was cool, but those are backups; they can safely stay un-de-duped, who cares, right? I didn't realize how mistaken I was until three weeks ago, when I attended the NetBackup appliances training and that same question was raised.
The trainer walked through an example that blew my mind; I hadn't realized how much a company can save by de-duping backup data globally. How? Let us see:
Assume a company operating on 20 TB of data (I made the numbers a little bigger to demonstrate how much you save). The data could be of any type (VMs, files, a mix, anything). Let us look at the following table for two weeks' worth of backup data (two weeks, to show the effect of full backups):
| Run | Non-de-duped size | De-duped size |
| --- | --- | --- |
| First week full backup | 20 TB | Maybe 10 TB (remember, the data is de-duped; expect a 50–60% size reduction) |
| One full week of differential data | 5 TB | 2.5 TB (de-duped data, size reduced) |
| Second week full backup | 25 TB | Maybe 0, or only about one day's worth of data; how much is that, 100 GB?! |
| Total | 50 TB | 15 TB |
What? Why is that? Because NetBackup appliances with de-dupe see the second full backup as data that can be 100% de-duped, and will effectively store only the data that has changed since the latest incremental backup. (How much is that? Certainly far less than a full backup.)
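To see why a repeated full backup adds almost nothing to the store, here is a minimal sketch of fingerprint-based, content-addressed de-dupe in Python. It is a toy model only: the class name, the tiny 4-byte chunk size, and the sample data are all invented for illustration and have nothing to do with NetBackup's actual algorithm.

```python
import hashlib

class DedupeStore:
    """Toy content-addressed store, standing in for a global de-dupe pool."""
    def __init__(self, chunk_size=4):
        self.chunk_size = chunk_size
        self.chunks = {}                      # fingerprint -> chunk bytes

    def backup(self, data: bytes) -> int:
        """Ingest a backup stream; return how many bytes were actually stored."""
        new_bytes = 0
        for i in range(0, len(data), self.chunk_size):
            chunk = data[i:i + self.chunk_size]
            fp = hashlib.sha256(chunk).hexdigest()
            if fp not in self.chunks:         # only never-seen chunks cost space
                self.chunks[fp] = chunk
                new_bytes += len(chunk)
        return new_bytes

store = DedupeStore()
dataset = b"AAAABBBBCCCCDDDD"                 # pretend this is the 20 TB
first_full = store.backup(dataset)            # everything is new: 16 bytes stored
changed = store.backup(b"EEEE")               # a small daily change: 4 bytes stored
second_full = store.backup(dataset + b"EEEE") # full backup again: 0 new bytes
print(first_full, changed, second_full)       # 16 4 0
```

The second full backup contributes zero new bytes because every chunk is already in the pool; only genuinely new chunks cost space, which is exactly the effect shown in the table above.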
Note: maybe the example is not fair; your backup software may use some sort of de-dupe technique of its own. But is it a global de-dupe? Do you get the full de-dupe efficiency across all the data? Do you get it across sites? Is it integrated with replication?
There is another angle: plenty of backup software can do de-dupe, but which of them can do it globally, across all the backup sets running in the environment? I think none. Backup software generally does the trick on a per-job basis, meaning data within a single backup job, folder, or disk is de-duped, but not globally across all the backup jobs, and not across the appliances themselves (the DR-site scenario, or remote sites with NBU appliances).
I love backup de-dupe, I love it so much. Starting tomorrow I will let you see NBU appliances in action; I have set up the lab, and you will see the NBU appliances' effect. Buckle up and enjoy the ride.
Officially recognized as a Backup Exec BExpert, thank you Symantec
I spotted a tweet by Sean Regan referring to a blog post by Matt Stephenson about Symantec’s BExperts program.
The program is still new. Similar to Microsoft's MVP program and VMware's vExpert program, it recognizes community experts who have demonstrated exceptional skills within the Symantec and Backup Exec community (more details can be found here).
Today I got the amazing news: Symantec recognized me as one of the very early BExperts (No. 20, according to the forum count). Yes!
The program is still new; as far as I can tell it started 5 months ago. Still, it is a distinguished recognition of my contributions over the past 2 years. I was blogging about Backup Exec and Exchange restoration, because I knew there was a lot of pain around Exchange 2010 and Backup Exec and I just wanted to help. I was also doing my best in the Backup Exec section on Experts-Exchange.com, just doing my best :).
I believe the most important lesson learnt here is that community effort always pays off, on both the personal and the professional level.
Thank you Symantec for the recognition; I hope to serve the community more and more.
Non-Historical Facts – Why Is the Pineapple Called "Ananas"?
The pineapple ("ananas") was named after an incident that took place in Central Africa in the early centuries past. There was a tribe, Beni Shafrouh, that hunted gorillas. One of its men was setting a trap for the gorillas, but they charged him all at once and he took off running. When the men of Beni Shafrouh saw him, he was so beaten up they didn't recognize him and thought he was a gorilla. Finding nothing to throw at him, they picked that fruit from the tree and started pelting him with it, while he kept shouting "ana nas!" ("I'm a person!"). In the end they recognized him, laughed, killed all the gorillas, and named that delicious fruit "ananas".
Do you think there is some name whose origin you don't know? Send it to me on Twitter
http://www.twitter.com/_busbar
and I will find the non-historical truth behind it for you.
Installing ESXi 5.1 & vCenter 5.1 on VirtualBox
This is probably a silly post, but I wanted to create a reference for everyone installing ESXi 5.1 in general and on VirtualBox in particular. The steps are the same as for ESXi 5.0, so here we go:
ESXi Machine Configuration on VirtualBox:
For the VM, create a Linux 64-bit machine and configure it as follows:
At least 2100 MB of memory (the ESXi installation will not complete if it detects less than 2 GB):
For the CPU, configure the machine with at least 2 processors; the installation will not continue if you have only 1 CPU:
You might want to configure acceleration; however, you won't be able to start 64-bit machines on an ESXi nested inside VirtualBox, because VirtualBox doesn't expose the CPU's virtualization capabilities to the VMs running inside it (sad).
For the network, make sure to select a network adapter that is detectable by the ESXi installer:
Once done, map the ISO file and let us rock'n'roll.
Installing ESXi 5.1:
The screenshots are self-explanatory:
Once done, you are good to go; just wait for the installation to complete.
Installing vCenter 5.1:
vCenter 5.1 introduces the new SSO component. In this article I am choosing to install the basic SSO/vCenter combination; for the full SSO cluster setup, I highly recommend http://derek858.blogspot.com/2012/09/vmware-vcenter-51-installation-part-1.html
Let us start our next, next, OK journey:
Once done with the SSO, setup will prompt you for the service information:
Once it is installed, you are done with your single-server installation. Congrats…
Configuring Dynamic Access Controls and File Classification-Part4-#winservr 2012 #DAC #microsoft #mvpbuzz
Part1: The Windows Server 2012 new File Server–part 1- Access Condition http://goo.gl/9miY1
Part2: The Windows Server 2012 new File Server–part 2- Install AD RMS http://goo.gl/dRHro
Part3: The new file server part3 using file classification & AD RMS: http://goo.gl/A4JlC
In the previous parts we walked through the new file server features and permissions wizard, data classification, AD RMS installation, and file classification with AD RMS integration. In this final part of the series we will talk about how to implement a new Active Directory feature called claims-based authentication and utilize it for something called Dynamic Access Control.
But wait a minute: what is claims-based authentication? From this reference: http://www.windowsecurity.com/articles/First-Look-Dynamic-Access-Control-Windows-Server-2012.html
Claims-based authentication relies on a trusted identity provider. The identity provider authenticates the user, rather than every application doing so. The identity provider issues a token to the user, which the user then presents to the application as proof of identity. Identity is based on a set of information that, taken together, identifies a particular entity (such as a user or computer). Each piece of information is referred to as a claim. These claims are contained in the token. The token as a whole has the digital signature of the identity provider to verify the authenticity of the information it contains.
Windows Server 2012 turns claims into Active Directory attributes. These claims can be assigned to users or devices, using the Active Directory Administrative Center (ADAC). The identity provider is the Security Token Service (STS). The claims are stored inside the Kerberos ticket along with the user’s security identifier (SID) and group memberships.
Once the data has been identified and tagged – either automatically, manually or by the application – and the claims tokens have been issued, the centralized policies that you’ve created come into play.
Now you can turn users' attributes, whatever they are, into security controls. We now have the power to control access to files and set file permissions using attributes; we are no longer limited to group-based permissions.
With that in mind, you can set permissions on files based on the department attribute, the connecting machine, the location, or any other attribute in Active Directory, without having to create dedicated groups for it, and the permissions are evaluated on the fly. Not only that: you can set permissions based not only on the user's properties but also on the device the user is connecting from, for example full control from corporate devices but read-only from kiosks or non-corporate devices.
Not only that, you can also include attributes of the resource being accessed in the permissions equation. So you can, on the fly, examine the resource classification and allow only specific users with specific attributes to access the resource (for example, files classified with country "Egypt" can be accessed only by users who are in country "Egypt").
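To make the idea concrete, here is a small, purely conceptual Python sketch of a claims-based check. The `Principal`, `Resource`, and `allowed` names are invented for illustration; this is not how Windows evaluates claims internally, just the logic of the "Egypt" example above.

```python
from dataclasses import dataclass, field

@dataclass
class Principal:
    """A user or a device, carrying a bag of claims (attribute -> value)."""
    name: str
    claims: dict = field(default_factory=dict)

@dataclass
class Resource:
    """A file, carrying its classification properties."""
    name: str
    classification: dict = field(default_factory=dict)

def allowed(user: Principal, device: Principal, resource: Resource) -> bool:
    # Grant access only when both the user's claim AND the device's
    # claim match the resource classification, not a group membership.
    wanted = resource.classification.get("country")
    return (wanted is not None
            and user.claims.get("country") == wanted
            and device.claims.get("country") == wanted)

doc = Resource("budget.xlsx", {"country": "Egypt"})
alice = Principal("alice", {"country": "Egypt"})
laptop = Principal("corp-laptop", {"country": "Egypt"})
kiosk = Principal("kiosk", {"country": "Qatar"})

print(allowed(alice, laptop, doc))  # True
print(allowed(alice, kiosk, doc))   # False: the device claim doesn't match
```

Note how the same user is denied from the kiosk: the device's claims enter the equation exactly like the user's, which is the DAC behavior described above.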
Dynamic Access Control (DAC) is a new era for permissions. I am blown away by how powerful and flexible DAC is; mixed with AD RMS, it gives you ultimate control over the data within your enterprise.
Lab Setup:
We will use the steps described in this TechNet article: http://technet.microsoft.com/en-us/library/hh846167.aspx#BKMK_1_3 . The steps here illustrate that procedure, and the prior parts of this blog series (parts 1 to 3) serve as the foundation for the final environment:
Implementation steps:
The first thing to configure is the claim type. Claim types define which data is queried from the user/device/resource attributes and then used in the permission evaluation: if you want to query the country, you create a claim type for it; if you want to use the department, you create a claim type for that.
In our lab we will create claim types for Department and Country:
To create a claim type, open the AD Administrative Center, go to Claim Types, and from the menu select New:
Create a new claim for Department :
and for Country :
For Country, supply suggested values (to specify the claim values, such as Egypt and Qatar):
Note: by default, claims are issued to users; if you want a claim issued for computers, you must select that on the claim.
Create a new reference resource property for the Country claim:
Now go to Resource Properties and enable the Department claim:
Now let us create a Central Access Rule (CAR). This rule holds the template permissions that are applied when the claims match the conditions defined in the CAR:
In the rule, specify the security principals you want to use. In this demo we will grant Finance Admins full control and Finance Execs read-only access, applied to all files ("resources") classified under the Finance department. We can also add device claims, specifying the device's country or any other property we want to query about the device:
The final rules will be:
Now create a Central Access Policy (CAP) that will be applied to all file servers using GPO, so that an administrator can select and apply it on individual folders:
In the CAP, include the finance data rule:
Now you need to apply this CAP using a GPO to make it available to the file servers, so create a GPO and link it to the file servers' OU:
In the Group Policy Management Editor window, navigate to Computer Configuration, expand Policies, expand Windows Settings, and click Security Settings.
Expand File System, right-click Central Access Policy, and then click Manage Central access policies.
In the Central Access Policies Configuration dialog box, add Finance Data, and then click OK.
You now need to allow the Domain Controllers to issue the claims to users. This is done by editing the Domain Controllers GPO and specifying the claims settings:
Open Group Policy Management, click your domain, and then click Domain Controllers.
Right-click Default Domain Controllers Policy, and then click Edit.
In the Group Policy Management Editor window, double-click Computer Configuration, double-click Policies, double-click Administrative Templates, double-click System, and then double-click KDC.
Double-click KDC Support for claims, compound authentication and Kerberos armoring. In the KDC Support for claims, compound authentication and Kerberos armoring dialog box, click Enabled and select Supported from the Options drop-down list. (You need to enable this setting to use user claims in central access policies.)
Close Group Policy Management.
Open a command prompt and type gpupdate /force.
Testing the Configuration:
Going to the file server and clicking on our finance data file, we can now find the data classifications that we specified in the claims:
Now let us classify the data as Finance Department.
Note: in order for DAC permissions to come into play, grant Everyone full control in NTFS; DAC will then override it. If the user doesn't have NTFS permissions, he will be denied access even if DAC grants him access.
Now checking the permissions on the folder:
Going to the Central Policy tab and applying the Finance Data policy:
Now let us examine the effective permissions:
For the Finance Admins:
If the user has no claims (he is a member of the group, but is not in the Finance department and is not located in Egypt), he will be denied access:
Now let us specify that he is from the Finance department. Still no luck. Why?!
This is because he must access the data from a device that has claim type country Egypt:
Now test the Finance Execs permissions and confirm they are working.
You can also test applying this rule when the following condition is set, and see what happens:
Note: the above rule grants a user access when his department matches the file's classified department, so you can have one giant share holding a mix of departments, and permissions will be granted per file based on each user's department.
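The attribute-matching condition in this note can be sketched as follows. `department_rule` is a hypothetical name, and the two dictionaries stand in for the user's claims and the file's classification properties:

```python
# Hypothetical sketch of a "user claim equals resource property" rule,
# as opposed to comparing the claim against a fixed literal like "Finance".
def department_rule(user_claims: dict, file_classification: dict) -> bool:
    dept = user_claims.get("department")
    # Grant access only when the user's department matches whatever
    # department the file was classified with.
    return dept is not None and dept == file_classification.get("department")

print(department_rule({"department": "Finance"}, {"department": "Finance"}))  # True
print(department_rule({"department": "HR"}, {"department": "Finance"}))       # False
```

Because the rule compares the two attributes to each other rather than hard-coding a department, a single rule covers a share that mixes many departments.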
Conclusion:
Mixing DAC with AD RMS and file classification is a powerful combination that helps organizations with the DLP dilemma; with Windows Server 2012, organizations have, for the first time, total control over their files and the data within them. Please try the lab and let me know your feedback.
Backup & Restore an Exchange 2010 mailbox database or mailbox item using ARCserve r16 #msexchange #arcserve
In my ultimate journey of discovering how to back up and restore Exchange 2010 with every single application in our universe, today I blog about how to do it using CA's ARCserve r16 SP1.
We will continue using my single Exchange server, this time installing ARCserve r16 SP1 and then discovering how to create a backup job for Exchange and restore from our backup.
Installing ARCserve r16 SP1:
There is nothing tricky about installing ARCserve; you may want to plan ahead for the following:
Other than that, the installation itself is a no-brainer: next, next, and OK.
Configuring ARCserve r16 Devices:
Once you have finished the installation and opened the ARCserve console "Manage", you will be prompted with a very nice tutorial that walks you through the basic configuration of your ARCserve.
In this step we will configure the disk device that we will use for backup to disk, so from Devices choose Launch Device Configuration:
In the Login Server screen, enter your credentials to log in to the server:
In the Login Server choose your login server:
In the Device Configuration screen, choose Windows File System Devices to configure a backup folder (a de-duplication device is a folder that can be configured to store multiple backups; ARCserve divides the backup into small chunks that are compared and de-duplicated using ARCserve's proprietary algorithm), then click Add:
And if you somehow missed the wizard, you can do the same using the Device Wizard from the Administration menu:
Once the device is configured, we can deploy the agent and start protecting our Exchange server. You can do that from Administration, then Agent Deployment:
Note: in order to back up the Exchange server using ARCserve, you must install MAPI CDO. This is a must because, unlike Symantec, which uses EWS to restore emails, ARCserve uses MAPI CDO to back up and restore individual emails. Also note that MAPI CDO must be installed before installing ARCserve; if you don't, you will get the following error message:
“The request is denied by the agent. The requested agent is not installed.”
When you deploy agents for the first time, you must specify the ARCserve source to copy the agents from. Once they are copied, you won't need to do that again, and you will be able to proceed with the deployment:
Once copied, proceed with the agent deployment and specify the Login Server:
For the agent installation options you will normally get Automatic; you might want to choose Custom to fine-tune the installation options:
In the agent selection, select the agents that need to be deployed:
In the host selection, you have a nice option to discover the Exchange servers and deploy the agent to them automatically:
To discover the Exchange infrastructure, just specify your Domain Controller and credentials, and ARCserve will discover the Exchange servers for you. Nice!
Backup Exchange 2010 Mailbox Database and Mailboxes using ARCserve:
Creating a backup job is easy: from the Protection & Recovery menu choose Backup:
From the Job Setup menu, select your Job Setup type:
In the Source, select the mailbox database. If you want to recover specific mailboxes or mailbox items, you must configure a Document Level backup; unlike Symantec, which uses one backup type to restore a mailbox database, a mailbox, or a mailbox item, ARCserve uses two types of backup (a mailbox database backup for database-level restores, and a Document Level backup for mailboxes and mailbox items):
In the Schedule, select your schedule:
In the Destination, select your destination; in my case I will use the folder I configured previously:
Once everything is set, click the Submit button to run the job.
Restore the Exchange Mailbox Database or Mailbox items from the ARCserve Backup:
Now you can restore either the mailbox database or individual mailbox items: go to the Restore section, explore the Exchange infrastructure, and select either the mailbox database or the mailbox items:
Conclusion:
In this article we explored the basic ARCserve configuration and how to back up and restore an Exchange 2010 mailbox database and mailboxes using ARCserve. It was easy and sweet, although I don't understand why in ARCserve I have to create two duplicate jobs to back up the mailbox database and the mailboxes (document level).
So what is the next product? I don't know; I will be waiting for your suggestions, so let me know and I will blog it.