Institute of Electrical and Electronics Engineers
The Institute of Electrical and Electronics Engineers (IEEE) is a 501(c)(3) professional association for electronic engineering and electrical engineering (and associated disciplines) with its corporate office in New York City and its operations center in Piscataway, New Jersey. It was formed in 1963 from the amalgamation of the American Institute of Electrical Engineers and the Institute of Radio Engineers.
Because its scope has expanded into so many related fields, it is referred to simply by the letters I-E-E-E (pronounced "I-triple-E"), except on legal business documents. As of 2018, it is the world's largest association of technical professionals, with more than 423,000 members in over 160 countries. Its objectives are the educational and technical advancement of electrical and electronic engineering, telecommunications, computer engineering, and similar disciplines.
History
Origin
Main articles: American Institute of Electrical Engineers § History; Institute of Radio Engineers § History
The IEEE traces its founding to 1884 and the American Institute of Electrical Engineers. In 1912, the rival Institute of Radio Engineers was formed. Although the AIEE was initially larger, the IRE attracted more students and was larger by the mid-1950s. The AIEE and IRE merged in 1963.
The IEEE headquarters is in New York City at 3 Park Ave, but most business is done at the IEEE Operations Center in Piscataway, New Jersey, first occupied in 1975.
Growth
The AIEE and the IRE merged to create the IEEE on January 1, 1963. At that time, the combined group had 150,000 members, 93% in the United States. By 1984 there were 250,000 members, 20% of whom were outside the U.S.
The Australian Section of the IEEE existed between 1972 and 1985. After this date, it split into state- and territory-based sections.
As of 2021, IEEE has over 400,000 members in 160 countries, with U.S.-based membership no longer constituting a majority.
Compiler
In computing, a compiler is a computer program that translates computer code written in one programming language (the source language) into another language (the target language). The name "compiler" is primarily used for programs that translate source code from a high-level programming language to a lower-level language (e.g., assembly language, object code, or machine code) to create an executable program.
There are many different types of compilers which produce output in different useful forms. A cross-compiler produces code for a different CPU or operating system than the one on which the cross-compiler itself runs. A bootstrap compiler is often a temporary compiler, used for compiling a more permanent or better optimised compiler for a language.
History
Some early milestones in the development of compiler technology:
1952: Alick Glennie developed an Autocode compiler for the Manchester Mark 1 computer at the University of Manchester; Autocode is considered by some to be the first compiled programming language.
1952: Grace Hopper's team at Remington Rand wrote the compiler for the A-0 programming language (and coined the term compiler to describe it), although the A-0 compiler functioned more as a loader or linker than the modern notion of a full compiler.
1954–1957: A team led by John Backus at IBM developed FORTRAN which is usually considered the first high-level language. In 1957, they completed a FORTRAN compiler that is generally credited as having introduced the first unambiguously complete compiler.
1959: The Conference on Data Systems Languages (CODASYL) initiated development of COBOL. The COBOL design drew on A-0 and FLOW-MATIC. By the early 1960s COBOL was compiled on multiple architectures.
1958–1962: John McCarthy at MIT designed LISP. Its symbol processing capabilities provided useful features for artificial intelligence research. In 1962, the LISP 1.5 release included several tools: an interpreter written by Stephen Russell and Daniel J. Edwards, and a compiler and assembler written by Tim Hart and Mike Levin.
Compiler construction
A compiler implements a formal transformation from a high-level source program to a low-level target program. Compiler design can define an end-to-end solution or tackle a defined subset that interfaces with other compilation tools, e.g., preprocessors, assemblers, and linkers. Design requirements include rigorously defined interfaces, both internally between compiler components and externally between supporting toolsets.
In the early days, the approach taken to compiler design was directly affected by the complexity of the computer language to be processed, the experience of the person(s) designing it, and the resources available. Resource limitations led to the need to pass through the source code more than once.
A compiler for a relatively simple language written by one person might be a single, monolithic piece of software. However, as the source language grows in complexity the design may be split into a number of interdependent phases. Separate phases provide design improvements that focus development on the functions in the compilation process.
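To make the phase structure concrete, here is a minimal sketch in Python, a toy for illustration only (no real compiler is this simple): it splits the compilation of integer arithmetic expressions into a lexing phase, a parsing phase, and a code-generation phase targeting an imaginary stack machine.

```python
import re

# Phase 1: lexical analysis -- turn source text into a token stream.
def lex(source):
    tokens = []
    for number, op in re.findall(r"\s*(?:(\d+)|([+*]))", source):
        tokens.append(("NUM", int(number)) if number else ("OP", op))
    return tokens

# Phase 2: parsing -- build an abstract syntax tree, honoring precedence.
# (Error handling is omitted to keep the toy short.)
def parse(tokens):
    pos = 0
    def peek():
        return tokens[pos] if pos < len(tokens) else (None, None)
    def factor():                      # factor := NUM
        nonlocal pos
        _, value = tokens[pos]
        pos += 1
        return ("num", value)
    def term():                        # term := factor ('*' factor)*
        nonlocal pos
        node = factor()
        while peek() == ("OP", "*"):
            pos += 1
            node = ("mul", node, factor())
        return node
    def expr():                        # expr := term ('+' term)*
        nonlocal pos
        node = term()
        while peek() == ("OP", "+"):
            pos += 1
            node = ("add", node, term())
        return node
    return expr()

# Phase 3: code generation -- emit instructions for an imaginary stack machine.
def codegen(node):
    if node[0] == "num":
        return [f"PUSH {node[1]}"]
    left, right = codegen(node[1]), codegen(node[2])
    return left + right + [{"add": "ADD", "mul": "MUL"}[node[0]]]

print(codegen(parse(lex("2 + 3 * 4"))))
# ['PUSH 2', 'PUSH 3', 'PUSH 4', 'MUL', 'ADD']
```

Each phase consumes the previous phase's output, which is exactly the kind of interdependence between phases described above.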
VGA connector
The Video Graphics Array (VGA) connector is a standard connector used for computer video output. Originating with the 1987 IBM PS/2 and its VGA graphics system, the 15-pin connector went on to become ubiquitous on PCs, as well as many monitors, projectors and high-definition television sets.
Other connectors have been used to carry VGA-compatible signals, such as mini-VGA or BNC, but "VGA connector" typically refers to this design.
Devices continue to be manufactured with VGA connectors, although newer digital interfaces such as DVI, HDMI and DisplayPort are increasingly displacing VGA, and many modern computers and other devices do not include it.
Physical design
The VGA connector is a three-row, 15-pin D-subminiature connector referred to variously as DE-15, HD-15 or DB-15. DE-15 is the most accurate common nomenclature under the D-sub specifications: an "E" size D-sub connector, with 15 pins in three rows.
Electrical design
All VGA connectors carry analog RGBHV (red, green, blue, horizontal sync, vertical sync) video signals. Modern connectors also include VESA DDC pins, for identifying attached display devices.
In both its modern and original variants, VGA utilizes multiple scan rates, so attached devices such as monitors are multisync by necessity.
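To illustrate what a scan rate is, the sketch below derives the horizontal and vertical frequencies of the classic 640×480 at 60 Hz VGA mode from its standard timing figures; the numbers are well known, and the Python is just arithmetic for illustration.

```python
# Classic 640x480@60 VGA timing (standard figures, shown for illustration).
pixel_clock_hz = 25_175_000   # 25.175 MHz dot clock
total_pixels_per_line = 800   # 640 visible + blanking/sync
total_lines_per_frame = 525   # 480 visible + blanking/sync

h_freq_hz = pixel_clock_hz / total_pixels_per_line   # ~31.469 kHz
v_freq_hz = h_freq_hz / total_lines_per_frame        # ~59.94 Hz

print(f"horizontal scan rate: {h_freq_hz / 1e3:.3f} kHz")
print(f"vertical refresh:     {v_freq_hz:.2f} Hz")
```

A multisync monitor measures these frequencies on the incoming signal and adapts, which is why VGA could introduce new modes without new connectors.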
The VGA interface includes no affordances for hot swapping, the ability to connect or disconnect the output device during operation, although in practice this can be done and usually does not cause damage to the hardware or other problems. The VESA DDC specification does however include a standard for hot-swapping.
HDMI
High-Definition Multimedia Interface (HDMI) is a proprietary audio/video interface for transmitting uncompressed video data and compressed or uncompressed digital audio data from an HDMI-compliant source device, such as a display controller, to a compatible computer monitor, video projector, digital television, or digital audio device. HDMI is a digital replacement for analog video standards.
(Pictured: an AV receiver for use in home cinema, from 2012. The upper row of connectors are HDMI.)
HDMI implements the EIA/CEA-861 standards, which define video formats and waveforms, transport of compressed and uncompressed LPCM audio, auxiliary data, and implementations of the VESA EDID. CEA-861 signals carried by HDMI are electrically compatible with the CEA-861 signals used by the Digital Visual Interface (DVI). No signal conversion is necessary, nor is there a loss of video quality when a DVI-to-HDMI adapter is used. The Consumer Electronics Control (CEC) capability allows HDMI devices to control each other when necessary and allows the user to operate multiple devices with one handheld remote control device.
History
The HDMI founders were Hitachi, Panasonic, Philips, Silicon Image, Sony, Thomson, and Toshiba. Digital Content Protection, LLC provides HDCP (which was developed by Intel) for HDMI. HDMI has the support of motion picture producers Fox, Universal, Warner Bros. and Disney, along with system operators DirecTV, EchoStar (Dish Network) and CableLabs.
The HDMI founders began development on HDMI 1.0 on April 16, 2002, with the goal of creating an AV connector that was backward-compatible with DVI. At the time, DVI-HDCP (DVI with HDCP) and DVI-HDTV (DVI-HDCP using the CEA-861-B video standard) were being used on HDTVs. HDMI 1.0 was designed to improve on DVI-HDTV by using a smaller connector and adding audio capability and enhanced Y′CBCR capability and consumer electronics control functions.
The first Authorized Testing Center (ATC), which tests HDMI products, was opened by Silicon Image on June 23, 2003, in California, United States. The first ATC in Japan was opened by Panasonic on May 1, 2004, in Osaka. The first ATC in Europe was opened by Philips on May 25, 2005, in Caen, France. The first ATC in China was opened by Silicon Image on November 21, 2005, in Shenzhen. The first ATC in India was opened by Philips on June 12, 2008, in Bangalore. The HDMI website contains a list of all the ATCs.
Specifications
The HDMI specification defines the protocols, signals, electrical interfaces and mechanical requirements of the standard. The maximum pixel clock rate for HDMI 1.0 is 165 MHz, which is sufficient to allow 1080p and WUXGA (1920×1200) at 60 Hz. HDMI 1.3 increases that to 340 MHz, which allows for higher resolution (such as WQXGA, 2560×1600) across a single digital link. An HDMI connection can either be single-link (type A/C/D) or dual-link (type B) and can have a video pixel rate of 25 MHz to 340 MHz (for a single-link connection) or 25 MHz to 680 MHz (for a dual-link connection). Video formats with rates below 25 MHz (e.g., 13.5 MHz for 480i/NTSC) are transmitted using a pixel-repetition scheme.
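As a rough illustration of these limits, the sketch below computes a mode's pixel clock from its total (visible plus blanking) dimensions and checks it against the single-link maxima quoted above; the 1080p60 totals used here (2200×1125) are the standard CEA-861 figures.

```python
def pixel_clock_hz(total_h, total_v, refresh_hz):
    # Pixel clock = total pixels per line x total lines per frame x frames/sec,
    # where the totals include the blanking intervals, not just visible pixels.
    return total_h * total_v * refresh_hz

HDMI_1_0_LIMIT = 165_000_000   # 165 MHz single-link limit
HDMI_1_3_LIMIT = 340_000_000   # 340 MHz single-link limit

# 1080p60 with standard CEA-861 blanking: 2200 x 1125 totals.
clock = pixel_clock_hz(2200, 1125, 60)
print(f"{clock / 1e6:.1f} MHz")                       # 148.5 MHz
print("fits HDMI 1.0:", clock <= HDMI_1_0_LIMIT)      # True
print("fits HDMI 1.3:", clock <= HDMI_1_3_LIMIT)      # True
```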
Wireless Network
Computer networks that are not connected by cables are called wireless networks. They generally use radio waves for communication between the network nodes. They allow devices to be connected to the network while roaming around within the network coverage.
Types of Wireless Networks
Wireless LANs − Connects two or more network devices using wireless distribution techniques.
Wireless MANs − Connects two or more wireless LANs spreading over a metropolitan area.
Wireless WANs − Connects large areas comprising LANs, MANs and personal networks.
Advantages of Wireless Networks
It provides clutter-free desks due to the absence of wires and cables.
It increases the mobility of network devices connected to the system, since the devices need not be tethered to each other by cables.
Accessing network devices from any location within the network coverage or Wi-Fi hotspot becomes convenient, since laying out cables is not needed.
Installation and setup of wireless networks are easier.
New devices can be easily connected to the existing setup since they needn't be wired to the present equipment. Also, the number of devices that can be added to or removed from the system can vary considerably, since they are not limited by cable capacity. This makes wireless networks very scalable.
Wireless networks require very few or no wires, which reduces equipment and setup costs.
Examples of wireless networks
Mobile phone networks
Wireless sensor networks
Satellite communication networks
Terrestrial microwave networks
Modem
1. A modem or broadband modem is a hardware device that connects a computer or router to a broadband network; cable modems and DSL modems are two examples of this type of modem.
2. Short for modulator/demodulator, a modem is a hardware device that allows a computer to send and receive information over telephone lines. When sending a signal, the device converts ("modulates") digital data to an analog audio signal, and transmits it over a telephone line. Similarly, when an analog signal is received, the modem converts it back ("demodulates" it) to a digital signal.
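As a toy illustration of the modulation step, and not the actual scheme used by any particular modem standard, the Python sketch below maps bits onto two audio tones in the style of simple frequency-shift keying; the tone frequencies happen to match the Bell 103 originate-mode pair, but the bit duration and sample rate are arbitrary example values.

```python
import math

# Toy frequency-shift keying: one tone for 0, another for 1. The frequencies
# are the Bell 103 originate-mode pair; other values here are arbitrary.
SAMPLE_RATE = 8000             # samples per second
BIT_DURATION = 0.01            # seconds per bit (example value)
FREQ = {0: 1070.0, 1: 1270.0}  # Hz: space (0) and mark (1) tones

def modulate(bits):
    """Convert a bit sequence into a list of audio samples."""
    samples = []
    samples_per_bit = int(SAMPLE_RATE * BIT_DURATION)
    for bit in bits:
        for n in range(samples_per_bit):
            t = n / SAMPLE_RATE
            samples.append(math.sin(2 * math.pi * FREQ[bit] * t))
    return samples

signal = modulate([1, 0, 1, 1])
print(len(signal), "samples")   # 320 samples for 4 bits
```

Demodulation runs the same mapping in reverse: the receiver measures which tone is present during each bit interval and recovers the digital data.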
History of the modem
The first modem, known as the Dataphone, was released by AT&T in 1960. It later became more common for home users when Dennis Hayes and Dale Heatherington released the 80-103A modem in 1977.
Dial-up modems were commonly used by computers to connect to the Internet through the early 2000s until broadband Internet started to be more widely available. As broadband Internet became available and popular, dial-up modems were used by fewer computer users. Today, computers no longer come with a dial-up modem, requiring users who need one to purchase and install it.
What are the speeds of modems?
Modem speed is measured in bps (bits per second) or Kbps, the rate at which the modem can send and receive data. A 56 K (56,000 bps) modem is the fastest speed used with dial-up modems.
Earlier speeds of modems included 110 baud, 300 baud, 1200 baud, 2400 baud, 4800 baud, 9600 baud, 14.4k, 28.8k, and 33.6k.
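A quick back-of-the-envelope sketch of what such speeds mean in practice: dividing a file's size in bits by the line rate gives a lower bound on transfer time, ignoring protocol overhead and compression.

```python
def transfer_seconds(file_bytes, bits_per_second):
    # 8 bits per byte; ignores protocol overhead, compression, and line noise.
    return file_bytes * 8 / bits_per_second

one_megabyte = 1_000_000
print(f"{transfer_seconds(one_megabyte, 56_000):.0f} s at 56 kbps")   # ~143 s
print(f"{transfer_seconds(one_megabyte, 300):.0f} s at 300 baud")     # ~26667 s
```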
Internet & Intranet
Internet
It is a worldwide/global system of interconnected computer networks. It uses the standard Internet Protocol suite (TCP/IP). Every computer on the Internet is identified by a unique IP address. An IP address is a unique set of numbers (such as 110.22.33.114) that identifies a computer's location.
A special computer, called a DNS (Domain Name Server), is used to map a name to an IP address so that a user can locate a computer by name. For example, a DNS server will resolve the name https://www.tutorialspoint.com to a particular IP address to uniquely identify the computer on which this website is hosted.
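The same name-to-address resolution can be seen from a program's point of view with Python's standard library; the hostname below is just the example from the text, and the result depends on your system's resolver.

```python
import socket

# Ask the system resolver (which queries DNS) for the IP address behind a name.
hostname = "www.tutorialspoint.com"   # example hostname from the text
ip_address = socket.gethostbyname(hostname)
print(f"{hostname} resolves to {ip_address}")
```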
Intranet
An intranet is a system in which multiple PCs are connected to each other. PCs on an intranet are not available to the world outside the intranet. Usually each organization has its own intranet, and members/employees of that organization can access the computers on their intranet. Each computer on an intranet is also identified by an IP address that is unique among the computers on that intranet.
Similarities between Internet and Intranet
An intranet uses Internet protocols such as TCP/IP and FTP.
Intranet sites are accessible via a web browser in a similar way to websites on the Internet. However, only members of the intranet network can access intranet-hosted sites.
An intranet can run its own instant messengers, similar to Yahoo! Messenger or Google Talk on the Internet.
Differences between Internet and Intranet
The Internet is open to PCs all over the world, whereas an intranet is specific to a limited set of PCs.
The Internet provides wider and better access to websites for a large population, whereas access to an intranet is restricted.
The Internet is not as safe as an intranet; an intranet can be safely privatized as needed.
Web 2.0
What is Web 2.0 technology?
When it comes to defining Web 2.0, the term refers to Internet applications that give people opportunities to share and collaborate and help them express themselves online.
“Web 2.0 is the business revolution in the computer industry caused by the move to the internet as a platform, and any attempt to understand the rules for success on that new platform.” – Tim O'Reilly.
What are the examples of Web 2.0 applications?
Web 2.0 examples include hosted services (Google Maps), web applications (Google Docs, Flickr), video sharing sites (YouTube), wikis (MediaWiki), blogs (WordPress), social networking (Facebook), folksonomies (Delicious), microblogging (Twitter), podcasting (Podcast Alley), content hosting services, and many more.
Advantages of Web 2.0:
Available at any time, any place.
Variety of media.
Ease of usage.
Learners can be actively involved in knowledge building.
Can create dynamic learning communities.
Everybody is an author and an editor, and every edit that has been made can be tracked.
User-friendly.
Updates in the wiki are immediate and it offers more sources for researchers.
It provides real-time discussion.
Android
Android is a mobile operating system based on a modified version of the Linux kernel and other open source software, designed primarily for touchscreen mobile devices such as smartphones and tablets. Android is developed by a consortium of developers known as the Open Handset Alliance and commercially sponsored by Google. It was unveiled in November 2007, with the first commercial Android device, the HTC Dream, being launched in September 2008.
Most versions of Android are proprietary. The core components are taken from the Android Open Source Project (AOSP), which is free and open-source software primarily licensed under the Apache License. When Android is actually installed on devices, the ability to modify the otherwise FOSS software is usually restricted, either by not providing the corresponding source code or by preventing reinstallation through technical measures, rendering the installed version proprietary. Most Android devices ship with additional proprietary software pre-installed, most notably Google Mobile Services (GMS), which includes core apps such as Google Chrome, the digital distribution platform Google Play, and the associated Google Play Services development platform.
History
See also: Android version history
(Android has had four logotypes: the first from 2007–2014, the second from 2014–2015, the third from 2015–2019, and the fourth from 2019–present.)
Android Inc. was founded in Palo Alto, California, in October 2003 by Andy Rubin, Rich Miner, Nick Sears, and Chris White. Rubin described the Android project as having "tremendous potential in developing smarter mobile devices that are more aware of its owner's location and preferences". The early intentions of the company were to develop an advanced operating system for digital cameras, and this was the basis of its pitch to investors in April 2004. The company then decided that the market for cameras was not large enough for its goals, and five months later it had diverted its efforts and was pitching Android as a handset operating system that would rival Symbian and Microsoft Windows Mobile.
Rubin had difficulty attracting investors early on, and Android was facing eviction from its office space. Steve Perlman, a close friend of Rubin, brought him $10,000 in cash in an envelope, and shortly thereafter wired an undisclosed amount as seed funding. Perlman refused a stake in the company, and has stated "I did it because I believed in the thing, and I wanted to help Andy."
Pseudocode
In computer science, pseudocode is a plain language description of the steps in an algorithm or another system. Pseudocode often uses structural conventions of a normal programming language, but is intended for human reading rather than machine reading. It typically omits details that are essential for machine understanding of the algorithm, such as variable declarations and language-specific code. The programming language is augmented with natural language description details, where convenient, or with compact mathematical notation. The purpose of using pseudocode is that it is easier for people to understand than conventional programming language code, and that it is an efficient and environment-independent description of the key principles of an algorithm. It is commonly used in textbooks and scientific publications to document algorithms and in planning of software and other algorithms.
Application
Textbooks and scientific publications related to computer science and numerical computation often use pseudocode in description of algorithms, so that all programmers can understand them, even if they do not all know the same programming languages. In textbooks, there is usually an accompanying introduction explaining the particular conventions in use. The level of detail of the pseudocode may in some cases approach that of formalized general-purpose languages.
Syntax
Pseudocode generally does not actually obey the syntax rules of any particular language; there is no systematic standard form. Some writers borrow style and syntax for control structures from some conventional programming language, although this is discouraged. Some syntax sources include Fortran, Pascal, BASIC, C, C++, Java, Lisp, and ALGOL. Variable declarations are typically omitted. Function calls and blocks of code, such as code contained within a loop, are often replaced by a one-line natural language sentence.
Depending on the writer, pseudocode may therefore vary widely in style, from a near-exact imitation of a real programming language at one extreme, to a description approaching formatted prose at the other.
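As a small illustration of that range, here is one possible pseudocode style (there is no standard form) for finding the largest item in a list, shown as comments, followed by an equivalent runnable Python version:

```python
# Pseudocode (one possible style -- there is no standard form):
#
#   set largest to the first item in the list
#   for each remaining item in the list
#       if the item is greater than largest
#           remember the item as largest
#   return largest
#
# The same algorithm as runnable Python:
def largest(items):
    best = items[0]
    for item in items[1:]:
        if item > best:
            best = item
    return best

print(largest([3, 41, 12, 9, 74, 15]))  # 74
```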
Algorithm
In mathematics and computer science, an algorithm is a finite sequence of well-defined instructions, typically used to solve a class of specific problems or to perform a computation. Algorithms are used as specifications for performing calculations and data processing. By making use of artificial intelligence, algorithms can perform automated deductions (referred to as automated reasoning) and use mathematical and logical tests to divert the code through various routes (referred to as automated decision-making). Using human characteristics as descriptors of machines in metaphorical ways was already practiced by Alan Turing with terms such as "memory", "search" and "stimulus".
In contrast, a heuristic is an approach to problem solving that may not be fully specified or may not guarantee correct or optimal results, especially in problem domains where there is no well-defined correct or optimal result.
As an effective method, an algorithm can be expressed within a finite amount of space and time, and in a well-defined formal language for calculating a function. Starting from an initial state and initial input (perhaps empty), the instructions describe a computation that, when executed, proceeds through a finite number of well-defined successive states, eventually producing "output" and terminating at a final ending state. The transition from one state to the next is not necessarily deterministic; some algorithms, known as randomized algorithms, incorporate random input.
Informal definition
For a detailed presentation of the various points of view on the definition of "algorithm", see Algorithm characterizations.
An informal definition could be "a set of rules that precisely defines a sequence of operations", which would include all computer programs (including programs that do not perform numeric calculations), and (for example) any prescribed bureaucratic procedure or cook-book recipe.
In general, a program is only an algorithm if it eventually stops, even though infinite loops may sometimes prove desirable.
A prototypical example of an algorithm is the Euclidean algorithm, which is used to determine the greatest common divisor of two integers; one version (there are others) is sketched below.
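Here is that prototypical example as a short runnable sketch, using the remainder-based variant of the Euclidean algorithm:

```python
def gcd(a, b):
    # Euclidean algorithm: replace (a, b) with (b, a mod b) until b is zero.
    while b != 0:
        a, b = b, a % b
    return a

print(gcd(252, 105))  # 21
```

The loop always terminates because the remainder strictly decreases on each step, which is what makes this a genuine algorithm in the sense defined above.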
Boolos and Jeffrey (1974, 1999) offer an informal meaning of the word "algorithm".
Formalization
Algorithms are essential to the way computers process data. Many computer programs contain algorithms that detail the specific instructions a computer should perform—in a specific order—to carry out a specified task, such as calculating employees' paychecks or printing students' report cards. Thus, an algorithm can be considered to be any sequence of operations that can be simulated by a Turing-complete system. Authors who assert this thesis include Minsky (1967), Savage (1987) and Gurevich (2000).
Software Testing
What is Software Testing?
Software testing is a process of identifying the correctness of software by considering all its attributes (reliability, scalability, portability, reusability, usability) and evaluating the execution of software components to find software bugs, errors, or defects. Software testing provides an independent view and objective assessment of the software and gives assurance of its fitness.
It involves testing all components under the required services to confirm whether they satisfy the specified requirements. The process also provides the client with information about the quality of the software. Testing is mandatory, because it would be a dangerous situation if the software failed at any time due to lack of testing. So, without testing, software cannot be deployed to the end user.
What is Testing?
Testing is a group of techniques to determine the correctness of an application under a predefined script; however, testing cannot find all the defects of an application. The main intent of testing is to detect failures of the application so that failures can be discovered and corrected. Testing does not demonstrate that a product functions properly under all conditions; it only shows that the product does not work in some specific conditions.
Types of software testing
There are many different types of software tests, each with specific objectives and strategies:
Acceptance testing: Verifying whether the whole system works as intended.
Integration testing: Ensuring that software components or functions operate together.
Unit testing: Validating that each software unit performs as expected. A unit is the smallest testable component of an application. (A minimal unit-test sketch follows this list.)
Functional testing: Checking functions by emulating business scenarios, based on functional requirements. Black-box testing is a common way to verify functions.
Performance testing: Testing how the software performs under different workloads. Load testing, for example, is used to evaluate performance under real-life load conditions.
Regression testing: Checking whether new features break or degrade functionality. Sanity testing can be used to verify menus, functions and commands at the surface level, when there is no time for a full regression test.
Stress testing: Testing how much strain the system can take before it fails. Considered to be a type of non-functional testing.
Usability testing: Validating how well a customer can use a system or web application to complete a task.
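As promised in the unit-testing item above, here is a minimal unit-test sketch using Python's built-in unittest module; the function under test is a made-up example, not from any real code base.

```python
import unittest

def apply_discount(price, percent):
    """Function under test: a made-up example for illustration."""
    if not 0 <= percent <= 100:
        raise ValueError("percent must be between 0 and 100")
    return price * (1 - percent / 100)

class ApplyDiscountTest(unittest.TestCase):
    def test_typical_discount(self):
        self.assertAlmostEqual(apply_discount(200.0, 25), 150.0)

    def test_zero_discount_leaves_price_unchanged(self):
        self.assertEqual(apply_discount(99.0, 0), 99.0)

    def test_invalid_percent_is_rejected(self):
        with self.assertRaises(ValueError):
            apply_discount(100.0, 150)

if __name__ == "__main__":
    unittest.main()
```

Each test exercises one small behavior of one unit, which is what makes failures easy to localize when a change breaks something.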
Ethernet
What is Ethernet?
Ethernet is a type of communication protocol, created at Xerox PARC in 1973 by Robert Metcalfe and others, that connects computers on a network over a wired connection. It is a widely used LAN protocol, which was originally known as the Alto Aloha Network. It connects computers within local area networks and wide area networks. Numerous devices like printers and laptops can be connected by LAN and WAN within buildings, homes, and even small neighborhoods.
Different Types of Ethernet Networks
An Ethernet device using CAT5/CAT6 copper cables can be connected to a fiber optic cable through fiber optic media converters; this extension over fiber significantly increases the distance the network can cover. There are several kinds of Ethernet networks, which are discussed below:
Fast Ethernet: This type of Ethernet is usually supported by a twisted pair or CAT5 cable and can transfer or receive data at around 100 Mbps. When a device such as a camera or laptop is connected to the network, it functions at 100Base, or 10/100Base on the fiber side of the link. Fast Ethernet uses fiber optic cable and twisted pair cable to create communication. 100BASE-TX, 100BASE-FX, and 100BASE-T4 are the three categories of Fast Ethernet.
Gigabit Ethernet: This type of Ethernet network is an upgrade from Fast Ethernet that uses fiber optic cable and twisted pair cable to create communication. It can transfer data at a rate of 1000 Mbps, or 1 Gbps. In modern times, Gigabit Ethernet is the most common. Related network types using CAT5e or more advanced cables can transfer data at rates up to 10 Gbps.
Advantages of Ethernet
It is not very costly to form an Ethernet network. Compared to other systems of connecting computers, it is relatively inexpensive.
An Ethernet network provides high security for data, as it can use firewalls for data security.
Also, a Gigabit network allows users to transmit data at speeds of 1 to 100 Gbps.
The quality of data transfer is maintained on this network.
Administration and maintenance of this network are easier.
The latest versions of Gigabit Ethernet and wireless Ethernet can transmit data at speeds of 1–100 Gbps.
WordPress
WordPress (WP, WordPress.org) is a free and open-source content management system (CMS) written in PHP and paired with a MySQL or MariaDB database. Features include a plugin architecture and a template system, referred to within WordPress as Themes. WordPress was originally created as a blog-publishing system but has evolved to support other web content types including more traditional mailing lists and forums, media galleries, membership sites, learning management systems (LMS) and online stores. One of the most popular content management system solutions in use, WordPress is used by 42.8% of the top 10 million websites as of October 2021.
WordPress was released on May 27, 2003, by its founders, American developer Matt Mullenweg and English developer Mike Little, as a fork of b2/cafelog. The software is released under the GPLv2 (or later) license.
To function, WordPress has to be installed on a web server, either as part of an Internet hosting service like WordPress.com or on a computer running the software package from WordPress.org, in order to serve as a network host in its own right. A local computer may be used for single-user testing and learning purposes.
Overview
"WordPress is a factory that makes webpages" is a core analogy designed to clarify the functions of WordPress: it stores content and enables a user to create and publish webpages, requiring nothing beyond a domain and a hosting service.
WordPress has a web template system using a template processor. Its architecture is a front controller, routing all requests for non-static URIs to a single PHP file which parses the URI and identifies the target page. This allows support for more human-readable permalinks.
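The front-controller idea can be sketched generically; the Python below is an illustration of the pattern only, not WordPress's actual PHP implementation. A single entry point receives every request, parses the URI, and dispatches to the handler for the target page.

```python
# Generic front-controller sketch (illustrative; WordPress does this in PHP).
routes = {
    "/": lambda: "home page",
    "/about": lambda: "about page",
}

def front_controller(uri):
    # Single entry point: inspect the URI and pick the handler for the page.
    handler = routes.get(uri)
    if handler is None:
        return "404 not found"
    return handler()

print(front_controller("/about"))    # about page
print(front_controller("/missing"))  # 404 not found
```

Because every non-static URI funnels through one dispatcher, the URI scheme can be whatever reads best to humans, which is how WordPress supports human-readable permalinks.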
Themes
WordPress users may install and switch among many different themes. Themes allow users to change the look and functionality of a WordPress website without altering the core code or site content. Every WordPress website requires at least one theme to be present. Themes may be directly installed using the WordPress "Appearance" administration tool in the dashboard, or theme folders may be copied directly into the themes directory. WordPress themes are generally classified into two categories: free and premium. Many free themes are listed in the WordPress theme directory (also known as the repository), and premium themes are available for purchase from marketplaces and individual WordPress developers. WordPress users may also create and develop their own custom themes.
Bootstrap
Bootstrap is a free and open-source CSS framework directed at responsive, mobile-first front-end web development. It contains HTML, CSS and (optionally) JavaScript-based design templates for typography, forms, buttons, navigation, and other interface components.
As of August 2021, Bootstrap is the tenth most starred project on GitHub, with over 152,000 stars, behind freeCodeCamp (over 328,000 stars), Vue.js framework, React library, TensorFlow and others.
Early beginnings
Bootstrap, originally named Twitter Blueprint, was developed by Mark Otto and Jacob Thornton at Twitter as a framework to encourage consistency across internal tools. Before Bootstrap, various libraries were used for interface development, which led to inconsistencies and a high maintenance burden, according to Twitter developer Mark Otto.
Bootstrap 2
On January 31, 2012, Bootstrap 2 was released, which added built-in support for Glyphicons, several new components, as well as changes to many of the existing components. This version supports responsive web design, meaning the layout of web pages adjusts dynamically, taking into account the characteristics of the device used (whether desktop, tablet, or mobile phone).
Bootstrap 3
On August 19, 2013, Bootstrap 3 was released. It redesigned components to use flat design and a mobile-first approach. Bootstrap 3 features a new plugin system with namespaced events. Bootstrap 3 dropped Internet Explorer 7 and Firefox 3.6 support, but there is an optional polyfill for these browsers.
Bootstrap 4
Mark Otto announced Bootstrap 4 on October 29, 2014. The first alpha version of Bootstrap 4 was released on August 19, 2015. The first beta version was released on August 10, 2017. Mark suspended work on Bootstrap 3 on September 6, 2016, to free up time to work on Bootstrap 4. Bootstrap 4 was finalized on January 18, 2018.
Bootstrap 5
Bootstrap 5 was officially released on May 5, 2021.
JavaScript vs. AngularJS
JavaScript is a lightweight, object-oriented scripting language used to build dynamic HTML pages with interactive effects on a web page that runs in the client's web browser. It's a client-side scripting language that provides interactive effects to web pages to make them more dynamic. On the other hand, AngularJS is a JavaScript-based framework that extends HTML with new features. It is mainly designed to develop dynamic and single-page web applications (SPAs). In this article, we are going to discuss the differences between JavaScript and AngularJS. But before discussing the differences, we will get to know JavaScript and AngularJS.
What is JavaScript?
JavaScript is a lightweight, object-oriented scripting language that is used to build dynamic HTML pages with interactive effects on a webpage. JavaScript is also commonly used in game development and mobile app development. It is an interpreted scripting language whose code typically executes in a web browser, although it can also be run outside the browser. It is also known as a browser's language. It can be used for both client-side and server-side development. Brendan Eich of Netscape created it, and it was first published in 1995. The language was originally known as LiveScript before being renamed JavaScript. JavaScript's syntax is heavily influenced by the programming language C.
What is AngularJS?
AngularJS is an open-source front-end web development framework with great features and support. Google's Angular team first launched it in 2010. It is a framework that is continually developing and expanding to include better methods for creating web applications. It develops applications primarily using the model-view-controller (MVC) concept and supports both data binding and dependency injection.
Since AngularJS is mainly based on HTML and JavaScript, there is no need to learn another syntax or language. It transforms static HTML into dynamic HTML. It extends HTML's capabilities by adding built-in attributes and components, and allows creating custom attributes using plain JavaScript.
R (programming language)
R is a programming language for statistical computing and graphics supported by the R Core Team and the R Foundation for Statistical Computing. Created by statisticians Ross Ihaka and Robert Gentleman, R is used among data miners and statisticians for data analysis and developing statistical software. Users have created packages to augment the functions of the R language.
According to user surveys and studies of scholarly literature databases, R is one of the most commonly used programming languages in data mining. As of March 2022, R ranks 11th in the TIOBE index, a measure of programming language popularity.
The official R software environment is an open-source free software environment within the GNU package, available under the GNU General Public License. It is written primarily in C, Fortran, and R itself (partially self-hosting). Precompiled executables are provided for various operating systems. R has a command line interface. Multiple third-party graphical user interfaces are also available, such as RStudio, an integrated development environment, and Jupyter, a notebook interface.
History
R is an open-source implementation of the S programming language combined with lexical scoping semantics from Scheme, which allow objects to be defined in predetermined blocks rather than the entirety of the code. S was created by Rick Becker, John Chambers, Doug Dunn, Jean McRae, and Judy Schilling at Bell Labs around 1976. Designed for statistical analysis, the language is an interpreted language whose code can be run directly without a compiler. Many programs written for S run unaltered in R. As a dialect of the Lisp language, Scheme was created by Gerald J. Sussman and Guy L. Steele Jr. at MIT around 1975.
In 1991, statisticians Ross Ihaka and Robert Gentleman at the University of Auckland, New Zealand, embarked on an S implementation. It was named partly after the first names of the first two R authors and partly as a play on the name of S. They began publicizing it on the data archive StatLib and the s-news mailing list in August 1993. In 1995, statistician Martin Mächler convinced Ihaka and Gentleman to make R free and open-source software under the GNU General Public License. The first official release came in June 1995. The first official "stable beta" version (v1.0) was released on 29 February 2000.