Neural Information Processing: 30th International Conference, ICONIP 2023, Changsha, China, November 20–23, 2023, Proceedings, Part XIV (Communications in Computer and Information Science, 1968)

About the Book

The nine-volume set constitutes the refereed proceedings of the 30th International Conference on Neural Information Processing, ICONIP 2023, held in Changsha, China, in November 2023. The 652 papers presented in the proceedings set were carefully reviewed and selected from 1274 submissions. The ICONIP conference aims to provide a leading international forum for researchers, scientists, and industry professionals working in neuroscience, neural networks, deep learning, and related fields to share their new ideas, progress, and achievements.

Table of Contents:
Applications
  • Road Surface Segmentation and Detection Under Extreme Weather Conditions Based on Mask-RCNN
  • I-RAFT: Optical Flow Estimation Model Based on Multi-scale Initialization Strategy
  • Educational Pattern Guided Self-Knowledge Distillation for Siamese Visual Tracking
  • LSiF: Log-Gabor Empowered Siamese Federated Learning for Efficient Obscene Image Classification in the Era of Industry 5.0
  • Depth Normalized Stable View Synthesis
  • Exploring the Integration of Large Language Models into Automatic Speech Recognition Systems: An Empirical Study
  • Image Inpainting with Semantic U-Transformer
  • Multi-scale Context Aggregation for Video-based Person Re-identification
  • Enhancing LSTM and Fusing Articles of Law for Legal Text Summarization
  • Text Spotting of Electrical Diagram Based on Improved PP-OCRv3
  • Topic-aware Two-layer Context-enhanced Model for Chinese Discourse Parsing
  • A Malicious Code Family Classification Method Based on RGB Images and Lightweight Model
  • Research on Relation Extraction Based on BERT with Multifaceted Semantics
  • NMPose: Leveraging Normal Maps for 6D Pose Estimation
  • BFTracker: A One-Shot Baseline Model with Fusion Similarity Algorithm Towards Real-Time Multi-Object Tracking
  • Violence-MFAS: Audio-Visual Violence Detection Using Multimodal Fusion Architecture Search
  • DeepLink: Triplet Embedding and Spatio-Temporal Dynamics Learning of Link Representations for Travel Time Estimation
  • Reversible Data Hiding in Encrypted Images Based on Image Reprocessing and Polymorphic Compression
  • All You See Is the Tip of the Iceberg: Distilling Latent Interactions Can Help You Find Treasures
  • TRFN: Triple-Receptive-Field Network for Regional-Texture and Holistic-Structure Image Inpainting
  • PMFNet: A Progressive Multichannel Fusion Network for Multimodal Sentiment Analysis
  • Category-wise Meal Recommendation
  • A Data-free Substitute Model Training Method for Textual Adversarial Attacks
  • Detect Overlapping Community via Graph Neural Network and Topological Potential
  • DeFusion: Aerial Image Matching Based on Fusion of Handcrafted and Deep Features
  • Unsupervised Fabric Defect Detection Framework Based on Knowledge Distillation
  • Data Protection and Privacy: Risks and Solutions in the Contentious Era of AI-driven Ad Tech
  • Topic Modeling for Short Texts via Adaptive Pólya Urn Dirichlet Multinomial Mixture
  • Informative Prompt Learning for Low-shot Commonsense Question Answering via Fine-Grained Redundancy Reduction
  • Rethinking Unsupervised Domain Adaptation for Nighttime Tracking
  • A Bi-Directional Optimization Network for De-Obscured 3D High-Fidelity Surface Reconstruction
  • Jointly Extractive and Abstractive Training Paradigm for Text Summarization
  • A Three-Stage Framework for Event-Event Relation Extraction with Large Language Model
  • MEFaceNets: Multi-scale Efficient CNNs for Real-time Face Recognition on Embedded Devices
  • Optimal Low-rank QR Decomposition with an Application on RP-TSOD
  • EDDVPL: A Web Attribute Extraction Method with Prompt Learning
  • CACL: Commonsense-Aware Contrastive Learning for Knowledge Graph Completion
  • Graph Attention Hashing via Contrastive Learning for Unsupervised Cross-modal Retrieval
  • A Two-Stage Active Learning Algorithm for NLP Based on Feature Mixing
  • Botnet Detection Method Based on NSA and DRN
  • ASRCD: Adaptive Serial Relation-based Model for Cognitive Diagnosis
  • PyraBiNet: A Hybrid Semantic Segmentation Network Combining PVT and BiSeNet for Deformable Objects in Indoor Environments
  • Classification of Hard and Soft Wheat Species Using Hyperspectral Imaging and Machine Learning Models
  • Mitigation of Voltage Violation for Battery Charging Based on Data-Driven Optimization


Product Details
  • ISBN-13: 9789819981809
  • Publisher: Springer Verlag, Singapore
  • Publisher Imprint: Springer Verlag, Singapore
  • Height: 235 mm
  • No of Pages: 593
  • Series Title: Communications in Computer and Information Science, 1968
  • Sub Title: 30th International Conference, ICONIP 2023, Changsha, China, November 20–23, 2023, Proceedings, Part XIV
  • Width: 155 mm
  • ISBN-10: 9819981808
  • Publisher Date: 27 Nov 2023
  • Binding: Paperback
  • Language: English
  • Returnable: Y
  • Spine Width: 31 mm
  • Weight: 306 g

